For citation:
Semenova N. I. Recurrent neural network consisting of FitzHugh–Nagumo systems: characteristics required for training. Izvestiya VUZ. Applied Nonlinear Dynamics, 2025, vol. 33, iss. 4, pp. 590-604. DOI: 10.18500/0869-6632-003166, EDN: QXCLTF
Recurrent neural network consisting of FitzHugh–Nagumo systems: characteristics required for training
The purpose of this study is to determine the feasibility and features of training a recurrent neural network consisting of FitzHugh–Nagumo systems with delayed feedback to predict an impulse (spike) signal.
Methods. The network under consideration consisted of N = 60 FitzHugh–Nagumo systems with different delay times. Training determined which neuron should be activated and with what delayed-feedback strength. The network was trained by gradient descent from different initial conditions. In the course of the study it was found that standard characteristics for training recurrent networks, such as mean squared error (MSE) or mean absolute error, are not applicable to this task, so an alternative method for computing the loss function was proposed.
Results. The proposed combined loss function is the sum of the MSE and an interspike-interval error, which gives it the following advantages: (1) it incorporates information about spike periodicity and interspike intervals; (2) it responds adequately to the absence of a network output signal; (3) it accounts for small-amplitude fluctuations in addition to the impulse dynamics, which makes it possible to predict complex quasi-periodic signals. It has been shown that gradient descent can be used for this problem, but several initial conditions must be tried because of the nonlinearity of the loss function: the more initial conditions, the more accurate the result.
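The combined loss described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the threshold-crossing spike detector, the comparison of mean interspike intervals, and the flat penalty for a silent output are all assumptions introduced here for clarity.

```python
import numpy as np

def spike_times(x, dt=0.1, threshold=0.5):
    """Times at which the signal crosses `threshold` upward (spike onsets).
    Threshold-based detection is an assumption of this sketch."""
    above = x > threshold
    onsets = np.where(~above[:-1] & above[1:])[0] + 1
    return onsets * dt

def combined_loss(y_pred, y_true, dt=0.1, threshold=0.5, no_spike_penalty=10.0):
    """Sum of MSE and an interspike-interval (ISI) mismatch term.

    The MSE part tracks small-amplitude fluctuations; the ISI part
    encodes spike periodicity and penalizes a silent network output.
    """
    mse = np.mean((y_pred - y_true) ** 2)

    t_pred = spike_times(y_pred, dt, threshold)
    t_true = spike_times(y_true, dt, threshold)

    if len(t_true) < 2:      # target carries no ISI information
        return mse
    if len(t_pred) < 2:      # silent output: flat penalty (assumed form)
        return mse + no_spike_penalty

    # Mismatch of mean interspike intervals (one possible ISI error)
    isi_err = (np.mean(np.diff(t_pred)) - np.mean(np.diff(t_true))) ** 2
    return mse + isi_err
```

A prediction that reproduces the target's spike period scores far better than a silent output of the same mean level, which is exactly the behavior a plain MSE fails to provide.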
Conclusion. The problem of predicting a pulse (spike) signal with a self-closed recurrent neural network consisting of FitzHugh–Nagumo systems with delayed feedback has been successfully solved. It was shown which features must be taken into account when calculating the loss function and how gradient descent should be implemented.
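Because the loss landscape is nonlinear with multiple local minima, the abstract recommends restarting gradient descent from several initial conditions and keeping the best run. A generic multi-start sketch under these assumptions (the finite-difference gradients, learning rate, and restart count are illustrative choices, not the paper's settings):

```python
import numpy as np

def multistart_descent(loss, dim, n_starts=20, lr=1e-2, steps=500,
                       eps=1e-4, seed=0):
    """Gradient descent restarted from `n_starts` random initial
    parameter vectors; returns the best (lowest-loss) result.
    Gradients are estimated by central finite differences."""
    rng = np.random.default_rng(seed)
    best_w, best_loss = None, np.inf
    for _ in range(n_starts):
        w = rng.uniform(-1.0, 1.0, dim)          # random initial condition
        for _ in range(steps):
            grad = np.array([
                (loss(w + eps * e) - loss(w - eps * e)) / (2 * eps)
                for e in np.eye(dim)])
            w -= lr * grad
        final = loss(w)
        if final < best_loss:
            best_w, best_loss = w, final
    return best_w, best_loss
```

Each restart can converge to a different local minimum of a nonconvex loss; increasing `n_starts` raises the chance of finding a deep one, matching the observation that more initial conditions yield a more accurate result.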
- Yamazaki K, Vo-Ho V-K, Bulsara D, Le N. Spiking neural networks and their applications: A review. Brain Sci. 2022;12(7):863. DOI: 10.3390/brainsci12070863.
- Benjamin BV, Gao P, McQuinn E, Choudhary S, Chandrasekaran AR, Bussat JM, Alvarez-Icaza R, Arthur JV, Merolla PA, Boahen K. Neurogrid: A Mixed-Analog-Digital Multichip System for Large-Scale Neural Simulations. Proceedings of the IEEE. 2014;102(5):699–716. DOI: 10.1109/JPROC.2014.2313565.
- Schuman CD, Potok TE, Patton RM, Birdwell JD, Dean ME, Rose GS, Plank JS. A survey of neuromorphic computing and neural networks in hardware. arXiv:1705.06963. arXiv Preprint, 2017. DOI: 10.48550/arXiv.1705.06963.
- Ghimire D, Kil D, Kim S. A survey on efficient convolutional neural networks and hardware acceleration. Electronics. 2022;11(6):945. DOI: 10.3390/electronics11060945.
- Aguirre F, Sebastian A, Le Gallo M, Song W, Wang T, Yang JJ, Lu W, Chang M-F, Ielmini D, Yang Y, Mehonic A, Kenyon A, Villena MA, Roldán JB, Wu Y, Hsu H-H, Raghavan N, Suñé J, Miranda E, Eltawil A, Setti G, Smagulova K, Salama KN, Krestinskaya O, Yan X, Ang K-W, Jain S, Li S, Alharbi O, Pazos S, Lanza M. Hardware implementation of memristor-based artificial neural networks. Nat. Commun. 2024;15(1):1974. DOI: 10.1038/s41467-024-45670-9.
- Chen Y, Nazhamaiti M, Xu H, Meng Y, Zhou T, Li G, Fan J, Wei Q, Wu J, Qiao F, Fang L, Dai Q. All-analog photoelectronic chip for high-speed vision tasks. Nature. 2023;623(7985):48–57. DOI: 10.1038/s41586-023-06558-8.
- Brunner D, Soriano MC, Mirasso CR, Fischer I. Parallel photonic information processing at gigabyte per second data rates using transient states. Nat. Commun. 2013;4:1364. DOI: 10.1038/ncomms2368.
- McMahon PL. The physics of optical computing. Nat. Rev. Phys. 2023;5(12):717–734. DOI: 10.1038/s42254-023-00645-5.
- Tuma T, Pantazi A, Le Gallo M, Sebastian A, Eleftheriou E. Stochastic phase-change neurons. Nature Nanotechnology. 2016;11:693–699. DOI: 10.1038/nnano.2016.70.
- Torrejon J, Riou M, Araujo FA, Tsunegi S, Khalsa G, Querlioz D, Bortolotti P, Cros V, Yakushiji K, Fukushima A, Kubota H, Yuasa S, Stiles MD, Grollier J. Neuromorphic computing with nanoscale spintronic oscillators. Nature. 2017;547(7664):428–431. DOI: 10.1038/nature23011.
- Ponulak F, Kasinski A. Introduction to spiking neural networks: Information processing, learning and applications. Acta Neurobiologiae Experimentalis. 2011;71(4):409–433. DOI: 10.55782/ane-2011-1862.
- Ghosh-Dastidar S, Adeli H. Spiking neural networks. International Journal of Neural Systems. 2009;19(4):295–308. DOI: 10.1142/S0129065709002002.
- Merolla PA, Arthur JV, Alvarez-Icaza R, Cassidy AS, Sawada J, Akopyan F, Jackson BL, Imam N, Guo C, Nakamura Yu, Brezzo B, Vo I, Esser SK, Appuswamy R, Taba B, Amir A, Flickner MD, Risk WP, Manohar R, Modha DS. A million spiking-neuron integrated circuit with a scalable communication network and interface. Science. 2014;345(6197):668–673. DOI: 10.1126/science.1254642.
- Davies M, Srinivasa N, Lin T-H, Chinya G, Cao Y, Choday SH, Dimou G, Joshi P, Imam N, Jain S, Liao Y, Lin C-K, Lines A, Liu R, Mathaikutty D, McCoy S, Paul A, Tse J, Venkataramanan G, Weng Y-H, Wild A, Yang Y, Wang H. Loihi: A Neuromorphic Manycore Processor with On-Chip Learning. IEEE Micro. 2018;38(1):82–99. DOI: 10.1109/MM.2018.112130359.
- Furber SB, Galluppi F, Temple S, Plana LA. The SpiNNaker Project. Proceedings of the IEEE. 2014;102(5):652–665. DOI: 10.1109/JPROC.2014.2304638.
- Tavanaei A, Ghodrati M, Kheradpisheh SR, Masquelier T, Maida A. Deep learning in spiking neural networks. Neural Networks. 2019;111:47–63. DOI: 10.1016/j.neunet.2018.12.002.
- LeCun Y, Bottou L, Bengio Y, Haffner P. Gradient-based learning applied to document recognition. Proceedings of the IEEE. 1998;86(11):2278–2324. DOI: 10.1109/5.726791.
- Krizhevsky A, Hinton G. Learning multiple layers of features from tiny images. Technical Report. Toronto: University of Toronto; 2009. Available from: http://www.cs.utoronto.ca/~kriz/learning-features-2009-TR.pdf.
- Xiao H, Rasul K, Vollgraf R. Fashion-MNIST: A novel image dataset for benchmarking machine learning algorithms. arXiv:1708.07747. arXiv Preprint, 2017. DOI: 10.48550/arXiv.1708.07747.
- Nunes JD, Carvalho M, Carneiro D, Cardoso JS. Spiking Neural Networks: A survey. IEEE Access. 2022;10:60738–60764. DOI: 10.1109/ACCESS.2022.3179968.
- Han B, Roy K. Deep Spiking Neural Network: Energy Efficiency Through Time Based Coding. In: Vedaldi A, Bischof H, Brox T, Frahm JM, editors. Computer Vision – ECCV 2020. ECCV 2020. Lecture Notes in Computer Science. Vol. 12355. Cham: Springer; 2020. P. 388–404. DOI: 10.1007/978-3-030-58607-2_23.
- FitzHugh R. Impulses and physiological states in theoretical models of nerve membrane. Biophys. J. 1961;1(6):445–466. DOI: 10.1016/S0006-3495(61)86902-6.
- Nagumo J, Arimoto S, Yoshizawa S. An active pulse transmission line simulating nerve axon. Proceedings of the IRE. 1962;50(10):2061–2070. DOI: 10.1109/JRPROC.1962.288235.
- Bogatenko T, Sergeev K, Slepnev A, Kurths J, Semenova N. Symbiosis of an artificial neural network and models of biological neurons: Training and testing. Chaos. 2023;33(7):073122. DOI: 10.1063/5.0152703.
- LeCun Y, Cortes C, Burges CJC. The MNIST database of handwritten digits [Electronic resource] Available from: http://yann.lecun.com/exdb/mnist/.
- Semenov VV, Bukh AV, Semenova N. Delay-induced self-oscillation excitation in the FitzHugh–Nagumo model: Regular and chaotic dynamics. Chaos, Solitons & Fractals. 2023;172:113524. DOI: 10.1016/j.chaos.2023.113524.
- Pikovsky AS, Kurths J. Coherence resonance in a noise-driven excitable system. Phys. Rev. Lett. 1997;78(5):775–778. DOI: 10.1103/PhysRevLett.78.775.
- Pyragas K. Continuous control of chaos by self-controlling feedback. Physics Letters A. 1992;170(6):421–428. DOI: 10.1016/0375-9601(92)90745-8.
- Schöll E, Hiller G, Hövel P, Dahlem MA. Time-delayed feedback in neurosystems. Phil. Trans. R. Soc. A. 2009;367(1891):1079–1096. DOI: 10.1098/rsta.2008.0258.
- Just W, Pelster A, Schanz M, Schöll E. Delayed complex systems: an overview. Phil. Trans. R. Soc. A. 2010;368(1911):303–304. DOI: 10.1098/rsta.2009.0243.
- Masoliver M, Malik N, Schöll E, Zakharova A. Coherence resonance in a network of FitzHugh–Nagumo systems: Interplay of noise, time-delay, and topology. Chaos. 2017;27(10):101102. DOI: 10.1063/1.5003237.
- Ruder S. An overview of gradient descent optimization algorithms. arXiv:1609.04747. arXiv Preprint, 2017. DOI: 10.48550/arXiv.1609.04747.