ISSN 0869-6632 (Print)
ISSN 2542-1905 (Online)


For citation:

Kuptsov P. V., Stankevich N. V. Modeling of the Hodgkin–Huxley neural oscillators dynamics using an artificial neural network. Izvestiya VUZ. Applied Nonlinear Dynamics, 2024, vol. 32, iss. 1, pp. 72-95. DOI: 10.18500/0869-6632-003079, EDN: VCXHMY

This is an open access article distributed under the terms of Creative Commons Attribution 4.0 International License (CC-BY 4.0).
Language: 
Russian
Article type: 
Article
UDC: 
537.86
EDN: 
VCXHMY
Modeling of the Hodgkin–Huxley neural oscillators dynamics using an artificial neural network

Authors: 
Kuptsov Pavel Vladimirovich, Saratov Branch of Kotel'nikov Institute of Radiophysics and Electronics of Russian Academy of Sciences
Stankevich Nataliya Vladimirovna, National Research University "Higher School of Economics"
Abstract: 

The purpose of this study is to give a detailed description of the procedure for creating and training a neural network mapping, using as an example the modeling of the dynamics of a Hodgkin–Huxley-type neural oscillator, and to show that neural network mappings trained for single oscillators can be used as elements of a coupled system that simulates the behavior of coupled oscillators.
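
Schematically (the notation below is ours and is introduced only for illustration), such a neural network mapping approximates the evolution of the oscillator state over a fixed time step:

\[
  \mathbf{x}_{n+1} = \mathbf{f}_{\theta}(\mathbf{x}_n) \approx \Phi_{\Delta t}(\mathbf{x}_n),
\]

where \(\mathbf{x}_n\) is the oscillator state (membrane potential together with gating and concentration variables) at time \(t_n = n\,\Delta t\), \(\Phi_{\Delta t}\) is the time-\(\Delta t\) flow of the underlying differential equations, and \(\theta\) denotes the trainable network parameters.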

Methods. A numerical method for solving stiff systems of ordinary differential equations is used. Neural networks are trained by back-propagation of error together with the Adam optimization algorithm, a modified version of gradient descent with automatic step-size adjustment.
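
As a minimal sketch of such a training procedure (the toy right-hand side, network architecture, and hyperparameters below are illustrative assumptions, not those of the article), trajectory data produced by a stiff-capable ODE solver can be turned into (x_n, x_{n+1}) pairs and fitted with Adam:

    # Illustrative sketch: fit a mapping x_{n+1} = f_theta(x_n) to trajectory data.
    # The right-hand side, architecture, and hyperparameters are placeholders.
    import numpy as np
    import torch
    from scipy.integrate import solve_ivp

    def toy_oscillator(t, x):
        """Stand-in two-variable oscillator (FitzHugh-Nagumo), not the HH-type model."""
        v, w = x
        return [v - v**3 / 3 - w + 0.5, 0.08 * (v + 0.7 - 0.8 * w)]

    dt = 0.1
    t_eval = np.arange(0.0, 500.0, dt)
    sol = solve_ivp(toy_oscillator, (0.0, 500.0), [0.0, 0.0],
                    method="LSODA", t_eval=t_eval, rtol=1e-8, atol=1e-10)
    states = torch.tensor(sol.y.T, dtype=torch.float32)
    x_now, x_next = states[:-1], states[1:]        # training pairs (x_n, x_{n+1})

    net = torch.nn.Sequential(                     # small MLP playing the role of f_theta
        torch.nn.Linear(2, 64), torch.nn.Tanh(),
        torch.nn.Linear(64, 64), torch.nn.Tanh(),
        torch.nn.Linear(64, 2),
    )
    optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)  # gradient descent with adaptive steps
    loss_fn = torch.nn.MSELoss()

    for epoch in range(2000):                      # back-propagation of the prediction error
        optimizer.zero_grad()
        loss = loss_fn(net(x_now), x_next)
        loss.backward()
        optimizer.step()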

Results. It is shown that the neural network mappings built according to the described procedure are able to reproduce the dynamics of single neural oscillators. Moreover, without additional training, these mappings can be used as elements of a coupled system to model the dynamics of systems of coupled neural oscillators.
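
A minimal sketch of how two such trained mappings could be combined into a coupled system is given below; the diffusive coupling through the first (voltage-like) state variable and the strength eps are our illustrative assumptions, not the coupling scheme of the article:

    # Illustrative sketch: iterate two single-oscillator mappings as a coupled system.
    import torch

    @torch.no_grad()
    def iterate_coupled(net_a, net_b, xa, xb, eps=0.05, n_steps=1000):
        """Iterate the coupled map and collect the joint trajectory."""
        trajectory = []
        for _ in range(n_steps):
            xa_next = net_a(xa)
            xb_next = net_b(xb)
            coupling = eps * (xb[0] - xa[0])   # mutual diffusive term, voltage-like variable
            xa_next[0] += coupling
            xb_next[0] -= coupling
            xa, xb = xa_next, xb_next
            trajectory.append(torch.cat([xa, xb]))
        return torch.stack(trajectory)

    # e.g., with the mapping `net` and data `states` from the previous sketch:
    # joint = iterate_coupled(net, net, states[0].clone(), states[100].clone())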

Conclusion. The described neural network mapping can be considered a new universal framework for complex dynamics modeling. In contrast to models based on series expansions (power, trigonometric), a neural network mapping does not require truncation of the series. Consequently, it allows modeling processes with an arbitrary order of nonlinearity, so there is reason to believe that in some respects it will be more effective. The approach developed in this paper on the basis of the neural network mapping can be considered an alternative to traditional numerical methods of dynamics modeling. What makes this approach topical is the current rapid development of technologies for building fast computing hardware that supports neural network training and operation.

Acknowledgments: 
The study of mathematical models was carried out within the framework of the Mirror Laboratories Project of HSE University (Sections 1, 2). Development and research of the neural network mapping (Sections 3–5) was supported by the Russian Science Foundation, grant No. 20-71-10048.
References: 
  1. Levin E, Gewirtzman R, Inbar GF. Neural network architecture for adaptive system modeling and control. Neural Networks. 1991;4(2):185–191. DOI: 10.1016/0893-6080(91)90003-N.
  2. Grieger B, Latif M. Reconstruction of the El Niño attractor with neural networks. Climate Dynamics. 1994;10(6–7):267–276. DOI: 10.1007/BF00228027.
  3. Zimmermann HG, Neuneier R. Combining state space reconstruction and forecasting by neural networks. In: Bol G, Nakhaeizadeh G, Vollmer KH, editors. Datamining und Computational Finance. Vol. 174 of Wirtschaftswissenschaftliche Beiträge. Heidelberg: Physica; 2000. P. 259–267. DOI: 10.1007/978-3-642-57656-0_13.
  4. Gilpin W, Huang Y, Forger DB. Learning dynamics from large biological data sets: Machine learning meets systems biology. Current Opinion in Systems Biology. 2020;22:1–7. DOI: 10.1016/j.coisb.2020.07.009.
  5. Kolmogorov AN. On the representation of continuous functions of several variables by superpositions of continuous functions of a smaller number of variables. Amer. Math. Soc. Transl. 1961;17:369–373.
  6. Arnold VI. On functions of three variables. Amer. Math. Soc. Transl. 1963;28:51–54.
  7. Kolmogorov AN. On the representation of continuous functions of many variables by superposition of continuous functions of one variable and addition. Amer. Math. Soc. Transl. 1963;28:55–59.
  8. Cybenko G. Approximation by superpositions of a sigmoidal function. Mathematics of Control, Signals and Systems. 1989;2(4):303–314. DOI: 10.1007/BF02551274.
  9. Gorban AN. Generalized approximation theorem and the exact representation of polynomials in several variables via the superpositions of polynomials in one variable. Russian Mathematics. 1998;42(5):4–7.
  10. Haykin S. Neural Networks: A Comprehensive Foundation. Upper Saddle River, NJ: Prentice Hall; 1999. 842 p.
  11. Nikolenko S, Kadurin A, Arkhangelskaya E. Deep Learning. Saint Petersburg: Piter; 2018. 480 p. (in Russian).
  12. Cook S. CUDA Programming: A Developer’s Guide to Parallel Computing with GPUs. Morgan Kaufmann; 2012. 592 p.
  13. Jouppi NP, Young C, Patil N, Patterson D, Agrawal G, Bajwa R, Bates S, Bhatia S, Boden N, Borchers A, Boyle R, Cantin PL, Chao C, Clark C, Coriell J, Daley M, Dau M, Dean J, Gelb B, Ghaemmaghami TV, Gottipati R, Gulland W, Hagmann R, Ho CR, Hogberg D, Hu J, Hundt R, Hurt D, Ibarz J, Jaffey A, Jaworski A, Kaplan A, Khaitan H, Killebrew D, Koch A, Kumar N, Lacy S, Laudon J, Law J, Le D, Leary C, Liu Z, Lucke K, Lundin A, MacKean G, Maggiore A, Mahony M, Miller K, Nagarajan R, Narayanaswami R, Ni R, Nix K, Norrie T, Omernick M, Penukonda N, Phelps A, Ross J, Ross M, Salek A, Samadiani E, Severn C, Sizikov G, Snelham M, Souter J, Steinberg D, Swing A, Tan M, Thorson G, Tian B, Toma H, Tuttle E, Vasudevan V, Walter R, Wang W, Wilcox E, Yoon DH. In-datacenter performance analysis of a Tensor Processing Unit. ACM SIGARCH Computer Architecture News. 2017;45(2):1–12. DOI: 10.1145/3140659.3080246.
  14. Welser J, Pitera JW, Goldberg C. Future computing hardware for AI. In: 2018 IEEE International Electron Devices Meeting (IEDM). 1-5 December 2018, San Francisco, CA, USA. New York: IEEE; 2018. P. 131–136. DOI: 10.1109/IEDM.2018.8614482.
  15. Karras K, Pallis E, Mastorakis G, Nikoloudakis Y, Batalla JM, Mavromoustakis CX, Markakis E. A hardware acceleration platform for AI-based inference at the edge. Circuits, Systems, and Signal Processing. 2020;39(2):1059–1070. DOI: 10.1007/s00034-019-01226-7.
  16. Kuptsov PV, Kuptsova AV, Stankevich NV. Artificial neural network as a universal model of nonlinear dynamical systems. Russian Journal of Nonlinear Dynamics. 2021;17(1):5–21. DOI: 10.20537/nd210102.
  17. Kuptsov PV, Stankevich NV, Bagautdinova ER. Discovering dynamical features of Hodgkin–Huxley-type model of physiological neuron using artificial neural network. Chaos, Solitons & Fractals. 2023;167:113027. DOI: 10.1016/j.chaos.2022.113027.
  18. Sherman A, Rinzel J, Keizer J. Emergence of organized bursting in clusters of pancreatic beta-cells by channel sharing. Biophysical Journal. 1988;54(3):411–425. DOI: 10.1016/S0006-3495(88)82975-8.
  19. Stankevich N, Mosekilde E. Coexistence between silent and bursting states in a biophysical Hodgkin-Huxley-type of model. Chaos. 2017;27(12):123101. DOI: 10.1063/1.4986401.
  20. Malashchenko T, Shilnikov A, Cymbalyuk G. Six types of multistability in a neuronal model based on slow calcium current. PLoS ONE. 2011;6(7):e21782. DOI: 10.1371/journal.pone.0021782.
  21. Rozhnova MA, Pankratova EV, Stasenko SV, Kazantsev VB. Bifurcation analysis of multistability and oscillation emergence in a model of brain extracellular matrix. Chaos, Solitons & Fractals. 2021;151:111253. DOI: 10.1016/j.chaos.2021.111253.
  22. Pankratova EV, Sinitsina MS, Gordleeva S, Kazantsev VB. Bistability and chaos emergence in spontaneous dynamics of astrocytic calcium concentration. Mathematics. 2022;10(8):1337. DOI: 10.3390/math10081337.
  23. Press WH, Teukolsky SA, Vetterling WT, Flannery BP. Numerical Recipes: The Art of Scientific Computing. 3rd Edition. New York: Cambridge University Press; 2007. 1256 p.
  24. Shilnikov A, Cymbalyuk G. Transition between tonic spiking and bursting in a neuron model via the blue-sky catastrophe. Phys. Rev. Lett. 2005;94(4):048101. DOI: 10.1103/PhysRevLett.94.048101.
  25. Markovic D, Mizrahi A, Querlioz D, Grollier J. Physics for neuromorphic computing. Nature Reviews Physics. 2020;2:499–510. DOI: 10.1038/s42254-020-0208-2.
  26. Stankevich N, Koseska A. Cooperative maintenance of cellular identity in systems with intercellular communication defects. Chaos. 2020;30(1):013144. DOI: 10.1063/1.5127107.
  27. Kingma DP, Ba J. Adam: A method for stochastic optimization. arXiv:1412.6980. arXiv Preprint; 2014. DOI: 10.48550/arXiv.1412.6980. 
Received: 
27.04.2023
Accepted: 
08.09.2023
Available online: 
10.12.2023
Published: 
31.01.2024