ISSN 0869-6632 (Print)
ISSN 2542-1905 (Online)


For citation:

Sysoev I. V. Comparison of numerical realisation of algorithm of mutual information calculation based on nearest neighbours. Izvestiya VUZ. Applied Nonlinear Dynamics, 2016, vol. 24, iss. 4, pp. 86-95. DOI: 10.18500/0869-6632-2016-24-4-86-95

This is an open access article distributed under the terms of Creative Commons Attribution 4.0 International License (CC-BY 4.0).
Language: Russian
Article type: Article
UDC: 517.98.537

Comparison of numerical realisation of algorithm of mutual information calculation based on nearest neighbours

Authors: Sysoev Ilya Vyacheslavovich, Saratov State University
Abstract: Purpose. To compare the efficiency of different numerical realisations of approaches to estimation of the mutual information function based on nearest neighbours. Method. Two approaches to the calculation of the mutual information function were implemented numerically: a straightforward one based on brute force, and a sorting-based one. Results. The algorithmic complexity of the sorting-based method was shown to be less than that of the straightforward approach, but larger than the complexity of any quick sorting method alone. Discussion. Implementing the sorting-based method is reasonable when one has to deal with long samples, while for small samples the straightforward approach is sufficient.
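
The estimator being realised is, per reference [9], the Kraskov–Stögbauer–Grassberger (KSG) nearest-neighbour formula I(X,Y) ≈ ψ(k) + ψ(N) − <ψ(n_x+1) + ψ(n_y+1)>, where ψ is the digamma function, ε_i is the max-norm distance from point i to its k-th nearest neighbour in the joint space, and n_x(i), n_y(i) count the points whose marginal distance from point i is strictly less than ε_i. Below is a minimal Python/NumPy sketch of the two counting strategies compared in the abstract; it is an illustration only, not the author's implementation: the function names and the demo data are invented for the example, and the joint-space neighbour search is kept brute-force in both variants, so that only the marginal counting differs.

import numpy as np
from scipy.special import digamma

def mi_ksg_brute(x, y, k=3):
    """Straightforward variant: every marginal count is an O(N^2) scan."""
    n = len(x)
    dx = np.abs(x[:, None] - x[None, :])             # |x_i - x_j| for all pairs
    dy = np.abs(y[:, None] - y[None, :])             # |y_i - y_j| for all pairs
    dz = np.maximum(dx, dy)                          # max-norm in the joint space
    np.fill_diagonal(dz, np.inf)                     # a point is not its own neighbour
    eps = np.partition(dz, k - 1, axis=1)[:, k - 1]  # distance to the k-th joint NN
    nx = (dx < eps[:, None]).sum(axis=1) - 1         # strict marginal counts,
    ny = (dy < eps[:, None]).sum(axis=1) - 1         # self-point subtracted
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

def mi_ksg_sorted(x, y, k=3):
    """Sorting-based variant: each coordinate is sorted once, and each
    marginal count is then found by binary search, O(N log N) in total."""
    n = len(x)
    dz = np.maximum(np.abs(x[:, None] - x[None, :]),
                    np.abs(y[:, None] - y[None, :]))
    np.fill_diagonal(dz, np.inf)
    eps = np.partition(dz, k - 1, axis=1)[:, k - 1]
    xs, ys = np.sort(x), np.sort(y)
    # points with |x_j - x_i| < eps_i lie in the open interval
    # (x_i - eps_i, x_i + eps_i); bisection yields the count directly
    nx = (np.searchsorted(xs, x + eps, side='left')
          - np.searchsorted(xs, x - eps, side='right')) - 1
    ny = (np.searchsorted(ys, y + eps, side='left')
          - np.searchsorted(ys, y - eps, side='right')) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rho = 0.6
    x = rng.standard_normal(2000)
    y = rho * x + np.sqrt(1 - rho ** 2) * rng.standard_normal(2000)
    exact = -0.5 * np.log(1 - rho ** 2)  # analytic MI of a bivariate Gaussian
    print(mi_ksg_brute(x, y), mi_ksg_sorted(x, y), exact)

On this Gaussian test the two variants should agree with each other and lie close to the analytic value −(1/2)ln(1−ρ²). The sorting-based counting replaces an O(N²) scan with a one-off sort plus binary searches, which matches the ordering stated in the abstract: cheaper than the brute-force scan, but still more expensive overall than the sorting alone.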

References: 
  1. Fraser A.M., Swinney H.L. Independent coordinates for strange attractors from mutual information // Phys. Rev. A. 1986. Vol. 33. P. 1134.
  2. Wells W.M., Viola P., Atsumi H., Nakajima Sh., Kikinis R. Multi-modal volume registration by maximization of mutual information // Medical Image Analysis. 1996. Vol. 1, Iss. 1. Pp. 35–51.
  3. Pluim J.P.W., Maintz J.B.A., Viergever M.A. Mutual-information-based registration of medical images: A survey // IEEE Transactions on Medical Imaging. 2003. Vol. 22, Iss. 8. Pp. 986–1004.
  4. Paninski L. Estimation of entropy and mutual information // Neural Computation. 2003. Vol. 15. Pp. 1191–1253.
  5. Church K.W., Hanks P. Word association norms, mutual information, and lexicography // Computational Linguistics. 1990. Vol. 16, Iss. 1. Pp. 22–29.
  6. Moddemeijer R. On estimation of entropy and mutual information of continuous distributions // Signal Processing. 1989. Vol. 16, Iss. 3. Pp. 233–248.
  7. Jiang A.-H., Huang X.-C., Zhang Z.-H., Li J., Zhang Z.-Y., Hua H.-X. Mutual information algorithms // Mechanical Systems and Signal Processing. 2010. Vol. 24. Pp. 2947–2960.
  8. Moon Y.-I., Rajagopalan B., Lall U. Estimation of mutual information using kernel density estimators // Phys. Rev. E. 1995. Vol. 52, Iss. 3. Pp. 2318–2321.
  9. Kraskov A., Stögbauer H., Grassberger P. Estimating mutual information // Phys. Rev. E. 2004. Vol. 69. 066138.
  10. Abramowitz M., Stegun I.A., eds. Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables. 10th ed. New York: Dover, 1972. Pp. 258–259.
  11. Schreiber Th. Measuring information transfer // Phys. Rev. Lett. 2000. Vol. 85. Pp. 461–464.
  12. McGill W.J. Multivariate information transmission // Psychometrika. 1954. Vol. 19. Pp. 97–116.
Received: 10.08.2016
Accepted: 31.08.2016
Published: 31.08.2016