ISSN 0869-6632 (Print)
ISSN 2542-1905 (Online)


For citation:

Pugavko M. M., Maslennikov O. V., Nekorkin V. I. Dynamics of a network of map-based model neurons for supervised learning of a reservoir computing system. Izvestiya VUZ. Applied Nonlinear Dynamics, 2020, vol. 28, iss. 1, pp. 77-89. DOI: 10.18500/0869-6632-2020-28-1-77-89

This is an open access article distributed under the terms of Creative Commons Attribution 4.0 International License (CC-BY 4.0).
Language: Russian
Article type: Article
UDC: 530.182

Dynamics of a network of map-based model neurons for supervised learning of a reservoir computing system

Authors: 
Pugavko M. M., Institute of Applied Physics of the Russian Academy of Sciences
Maslennikov O. V., Institute of Applied Physics of the Russian Academy of Sciences
Nekorkin V. I., Institute of Applied Physics of the Russian Academy of Sciences
Abstract: 

The purpose of this work is to develop a reservoir computing system containing a network of discrete-time model neurons, and to study the characteristics of the system when it is trained to autonomously generate a harmonic target signal.

Methods. The work combines approaches of nonlinear dynamics (analysis of the phase space as a function of parameters), machine learning (reservoir computing, supervised error minimization) and computer modeling (implementation of numerical algorithms, plotting of characteristics and diagrams).

Results. A reservoir computing system based on a network of coupled discrete-time model neurons was constructed, and the possibility of training it, in a supervised manner, to generate a target signal by the FORCE method of controlled error minimization was demonstrated. It was found that the mean square training error decreases with increasing network size. The dynamical regimes arising in the activity of individual neurons within the reservoir at various stages of training were studied. It is shown that in the course of training the reservoir network passes from a state of spatiotemporal disorder to a state with regular clusters of spiking activity. The values of the coupling coefficients and of the parameters of the intrinsic neuron dynamics that correspond to the minimum training error were found.

Conclusion. A new reservoir computing system is proposed whose basic unit is the Courbage–Nekorkin discrete-time model neuron. The advantage of a network built on such a spiking neuron model is that the model is specified as a map, so no numerical integration is required. The proposed system proved effective when trained to autonomously generate a harmonic function, as well as a number of other target functions.
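To make the scheme concrete, the following minimal Python sketch assembles the two ingredients described above: a reservoir of discrete-time Courbage–Nekorkin map neurons and a linear readout trained online by FORCE (recursive least squares) to generate a harmonic target. The cubic nonlinearity F(x), the coupling and feedback scales, the choice to read the output from the fast variable, and all parameter values are illustrative assumptions, not the article's exact setup.

import numpy as np

rng = np.random.default_rng(0)

# Discrete-time Courbage-Nekorkin map neuron, in the general form
#   x_{n+1} = x_n + F(x_n) - y_n - beta * H(x_n - d)
#   y_{n+1} = y_n + eps * (x_n - J)
# where H is the Heaviside step function. The cubic nonlinearity
# F(x) = x (x - a) (1 - x) and every parameter value below are
# illustrative assumptions, not the exact choices of the article.
def cn_step(x, y, inp, a=0.1, eps=0.01, J=0.1, beta=0.3, d=0.45):
    F = x * (x - a) * (1.0 - x)           # fast nonlinearity
    H = np.heaviside(x - d, 0.0)          # spike-downstroke threshold term
    return x + F - y - beta * H + inp, y + eps * (x - J)

# Reservoir: N map neurons with random recurrent coupling and feedback of
# the readout z into the network; only the readout weights w are trained.
N, T = 200, 20000
W = 0.05 * rng.standard_normal((N, N)) / np.sqrt(N)   # recurrent weights (assumed scale)
w_fb = 0.1 * rng.standard_normal(N)                   # feedback weights (assumed scale)
w = np.zeros(N)                                       # trainable linear readout
P = np.eye(N)                                         # RLS inverse-correlation estimate

x = 0.1 * rng.standard_normal(N)
y = np.zeros(N)
target = np.sin(2.0 * np.pi * np.arange(T) / 400.0)   # harmonic target signal
errors = []

for t in range(T):
    z = w @ x                             # output read from the fast variables
    x, y = cn_step(x, y, W @ x + w_fb * z)
    # FORCE step: online recursive-least-squares update of w toward the target
    e = (w @ x) - target[t]
    Px = P @ x
    k = Px / (1.0 + x @ Px)               # RLS gain vector
    P -= np.outer(k, Px)                  # rank-1 update of P
    w -= e * k                            # error-driven readout correction
    errors.append(e * e)

print("RMS error over last 1000 steps:", np.sqrt(np.mean(errors[-1000:])))

The sketch mirrors the point made in the conclusion: the neuron state advances by pure map iteration in cn_step, with no integration step, while FORCE modifies only the readout weights w and leaves the intra-reservoir couplings W fixed.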

Received: 23.09.2019
Accepted: 05.11.2019
Published: 26.02.2020