For citation:
Sysoev I. V. Comparison of numerical realisation of algorithm of mutual information calculation based on nearest neighbours. Izvestiya VUZ. Applied Nonlinear Dynamics, 2016, vol. 24, iss. 4, pp. 86-95. DOI: 10.18500/0869-6632-2016-24-4-86-95
Comparison of numerical realisation of algorithm of mutual information calculation based on nearest neighbours
Purpose. To compare the efficiency of different realizations of nearest-neighbour approaches to estimating the mutual information function. Method. Two approaches to calculating the mutual information function were implemented numerically: a straightforward, brute-force approach and a sorting-based one. Results. The algorithmic complexity of the sorting-based method was shown to be lower than that of the straightforward approach, but higher than the complexity of any quick-sort method. Discussion. The sorting-based method is worth implementing when one has to deal with long samples, while for short samples the straightforward approach is sufficient.
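For illustration, below is a minimal brute-force sketch of the nearest-neighbour mutual information estimator of Kraskov, Stögbauer and Grassberger (reference list below), i.e. the "straightforward" O(N²) approach mentioned in the abstract. The function name, the choice of k, and the test data are illustrative assumptions, not taken from the paper.

```python
# A minimal, brute-force sketch of the Kraskov-Stogbauer-Grassberger (KSG)
# nearest-neighbour estimator of mutual information, illustrating the
# O(N^2) "straightforward" approach discussed in the abstract.
# Names and parameters are illustrative, not taken from the paper.
import numpy as np
from scipy.special import digamma

def mutual_information_ksg(x, y, k=3):
    """Estimate I(X;Y) in nats for scalar samples x, y using k nearest neighbours."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    psi_sum = 0.0
    for i in range(n):
        # Chebyshev (max-norm) distances in the joint (x, y) space.
        dx = np.abs(x - x[i])
        dy = np.abs(y - y[i])
        dz = np.maximum(dx, dy)
        dz[i] = np.inf                      # exclude the point itself
        eps = np.sort(dz)[k - 1]            # distance to the k-th neighbour
        # Marginal neighbour counts strictly inside eps (point itself excluded).
        n_x = np.count_nonzero(dx < eps) - 1
        n_y = np.count_nonzero(dy < eps) - 1
        psi_sum += digamma(n_x + 1) + digamma(n_y + 1)
    return digamma(k) + digamma(n) - psi_sum / n

# Usage: two correlated Gaussian series (rho = 0.6); the estimate should be
# close to the analytic value -0.5 * ln(1 - rho^2) ~= 0.223 nats.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = 0.6 * x + 0.8 * rng.normal(size=2000)
print(mutual_information_ksg(x, y, k=3))
```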
- Fraser A.M., Swinney H.L. Independent coordinates for strange attractors from mutual information // Phys. Rev. A. 1986. Vol. 33. 1134.
- Wells W.M., Viola P., Atsumi H., Nakajima Sh., Kikinis R. Multi-modal volume registration by maximization of mutual information // Medical Image Analysis. 1996. Vol. 1. Iss. 1. Pp. 35–51.
- Pluim J.P.W., Maintz J.B.A., Viergever M.A. Mutual-information-based registration of medical images: a survey // IEEE Trans. on Medical Imaging. 2003. Vol. 22, Iss. 8. Pp. 986–1004.
- Paninski L. Estimation of Entropy and Mutual Information // Neural Computation. 2003. Vol. 15. Pp. 1191–1253.
- Church K.W., Hanks P. Word association norms, mutual information, and lexicography // Computational Linguistics. 1990. Vol. 16, Iss. 1. Pp. 22–29.
- Moddemeijer R. On estimation of entropy and mutual information of continuous distributions // Signal Processing. 1989. Vol. 16, Iss. 3. Pp. 233–248.
- Ai-Hua Jiang, Xiu-Chang Huang, Zhen-Hua Zhang, Jun Li, Zhi-Yi Zhang, Hong-Xin Hua. Mutual information algorithms // Mechanical Systems and Signal Processing. 2010. Vol. 24. Pp. 2947–2960.
- Il-Moon Yo., Rajagopalan B., Lall U. Estimation of mutual information using kernel density estimators // Phys. Rev. E. 1995. Vol. 52, Iss. 3. Pp. 2318–2321.
- Kraskov A., Stogbauer H., Grassberger P. Estimating mutual information // Phys. Rev. E. 2004. Vol. 69. 066138.
- Abramowitz M., Stegun I.A., eds. Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables (10th ed.). New York: Dover. 1972. Pp. 258–259.
- Schreiber Th. Measuring Information Transfer // Phys. Rev. Lett. 2000. Vol. 85. Pp. 461–464.
- McGill W.J. Multivariate information transmission // Psychometrika. 1954. Vol. 19. Pp. 97–116.