ISSN 0869-6632 (Print)
ISSN 2542-1905 (Online)


For citation:

Maslennikov O. V. Dynamics of an artificial recurrent neural network for the problem of modeling a cognitive function. Izvestiya VUZ. Applied Nonlinear Dynamics, 2021, vol. 29, iss. 5, pp. 799-811. DOI: 10.18500/0869-6632-2021-29-5-799-811

This is an open access article distributed under the terms of Creative Commons Attribution 4.0 International License (CC-BY 4.0).
Language: Russian
Article type: Article
UDC: 530.182

Dynamics of an artificial recurrent neural network for the problem of modeling a cognitive function

Authors: 
Maslennikov O. V., Institute of Applied Physics of the Russian Academy of Sciences
Abstract: 

The purpose of this work is to build an artificial recurrent neural network whose activity models a cognitive function, namely the comparison of two vibrotactile stimuli arriving with a delay, and to analyze the dynamic mechanisms underlying its operation. The methods of the work are machine learning and the analysis of spatiotemporal dynamics and phase space. Results. The activity of the trained recurrent neural network models the cognitive function of comparing two stimuli separated by a delay. Model neurons exhibit mixed selectivity over the course of the task. Components are found in the multidimensional activity, each of which depends on a particular task parameter. Conclusion. Training the artificial neural network to perform a function analogous to the experimentally observed process is accompanied by the emergence of dynamic properties of model neurons similar to those found in the experiment.
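The delayed-comparison task described in the abstract can be sketched with a minimal firing-rate network. The sketch below is not the author's model: here the recurrent network is a random (untrained) reservoir and only a linear readout of the final state is fit by least squares, while the network size, time constant, epoch durations, and stimulus amplitudes are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100              # number of rate units (assumed)
dt, tau = 1.0, 10.0  # Euler step and neuronal time constant (assumed)

def make_trial(f1, f2, t_stim=20, t_delay=50, t_dec=20):
    """Scalar input with two pulses of amplitudes f1 and f2 separated by
    a delay; the target decision is +1 if f1 > f2, otherwise -1."""
    T = t_stim + t_delay + t_stim + t_dec
    u = np.zeros(T)
    u[:t_stim] = f1
    u[t_stim + t_delay:t_stim + t_delay + t_stim] = f2
    return u, (1.0 if f1 > f2 else -1.0)

# Random recurrent and input weights (untrained reservoir)
W = rng.normal(0.0, 1.5 / np.sqrt(N), size=(N, N))
w_in = rng.normal(0.0, 1.0, size=N)

def run(u):
    """Euler integration of the rate equation
    tau * dx/dt = -x + W @ tanh(x) + w_in * u(t)."""
    x = np.zeros(N)
    for u_t in u:
        x = x + (dt / tau) * (-x + W @ np.tanh(x) + w_in * u_t)
    return np.tanh(x)  # firing rates at the end of the trial

# All ordered pairs of distinct stimulus amplitudes
pairs = [(a, b) for a in (0.2, 0.4, 0.6, 0.8)
                for b in (0.2, 0.4, 0.6, 0.8) if a != b]
X = np.array([run(make_trial(f1, f2)[0]) for f1, f2 in pairs])
y = np.array([1.0 if f1 > f2 else -1.0 for f1, f2 in pairs])

# Least-squares linear readout of the decision from the final state
w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
acc = np.mean(np.sign(X @ w_out) == y)
```

In the work itself the recurrent weights are trained by gradient-based optimization (e.g., Adam) so that the network's own dynamics implement the decision; fitting only the readout, as above, is the simpler reservoir-computing setting.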

Acknowledgments: 
Design of the model was carried out within the framework of the Program for the Development of the Regional Scientific and Educational Mathematical Center “Mathematics of Future Technologies”, project 075-02-2020-1483/1. Analysis of the dynamics was supported by the Russian Science Foundation (grant No. 19-72-00112).
Received: 26.02.2021
Accepted: 11.05.2021
Published: 30.09.2021