ISSN 0869-6632 (Print)
ISSN 2542-1905 (Online)

For citation:

Zemlyannikov A. S., Sysoev I. V. Diagnostics and correction of systematic error while estimating transfer entropy with k-nearest neighbours method. Izvestiya VUZ. Applied Nonlinear Dynamics, 2015, vol. 23, iss. 4, pp. 24-31. DOI: 10.18500/0869-6632-2015-23-4-24-31

This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC-BY 4.0).
Article type:
UDC: 530.182, 51-73

Diagnostics and correction of systematic error while estimating transfer entropy with k-nearest neighbours method

Zemlyannikov Andrey Sergeevich, Saratov State University
Sysoev Ilya Vyacheslavovich, Saratov State University

Transfer entropy is widely used to detect directed coupling in oscillatory systems from their observed time series. A systematic error is detected when estimating transfer entropy between nonlinear systems with the k-nearest-neighbours method. A way to minimize this error is suggested: the error decreases as the number of neighbours increases. The possibility of detecting the systematic error is demonstrated on two sets of measured data. The results obtained make it possible to raise the sensitivity and specificity of the method for weakly coupled nonlinear systems.
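The k-nearest-neighbours estimator discussed in the abstract is of the Kraskov–Stögbauer–Grassberger (KSG) type, where transfer entropy is computed as the conditional mutual information I(y_{t+1}; x_t | y_t). A minimal Python/NumPy sketch is given below; the function names, the embedding dimension of 1, the max-norm, and the toy coupled autoregressive processes are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def digamma(z):
    # Digamma function via recurrence plus asymptotic expansion (avoids SciPy).
    r = 0.0
    while z < 6.0:
        r -= 1.0 / z
        z += 1.0
    return r + np.log(z) - 1.0 / (2 * z) - 1.0 / (12 * z**2) + 1.0 / (120 * z**4)

def transfer_entropy_knn(x, y, k=4):
    # KSG-style k-NN estimate of TE(x -> y) = I(y_{t+1}; x_t | y_t),
    # with scalar (dimension-1) past vectors and the max-norm.
    a, b, c = y[1:], x[:-1], y[:-1]     # future of y, past of x, past of y
    n = len(a)
    acc = 0.0
    for i in range(n):
        # Distance to the k-th nearest neighbour in the joint (a, b, c) space.
        d = np.maximum.reduce([np.abs(a - a[i]), np.abs(b - b[i]), np.abs(c - c[i])])
        d[i] = np.inf
        eps = np.partition(d, k - 1)[k - 1]
        # Neighbour counts strictly inside eps in the marginal subspaces
        # (point i itself is subtracted out).
        n_ac = np.count_nonzero(np.maximum(np.abs(a - a[i]), np.abs(c - c[i])) < eps) - 1
        n_bc = np.count_nonzero(np.maximum(np.abs(b - b[i]), np.abs(c - c[i])) < eps) - 1
        n_c = np.count_nonzero(np.abs(c - c[i]) < eps) - 1
        acc += digamma(k) - digamma(n_ac + 1) - digamma(n_bc + 1) + digamma(n_c + 1)
    return acc / n

# Toy example: x unidirectionally drives y, so TE(x->y) should exceed TE(y->x).
rng = np.random.default_rng(0)
n = 1000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.9 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.6 * x[t - 1] + rng.normal()

te_xy = transfer_entropy_knn(x, y, k=4)
te_yx = transfer_entropy_knn(y, x, k=4)
print(te_xy, te_yx)
```

Rerunning the estimate with a larger `k` is the kind of adjustment the abstract refers to: the neighbour number trades off bias against variance, and the systematic error shrinks as `k` grows.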
