ISSN 0869-6632 (Print)
ISSN 2542-1905 (Online)


The article is published as Early Access.

This is an open access article distributed under the terms of Creative Commons Attribution 4.0 International License (CC-BY 4.0).
Language: 
Russian
Article type: 
Article
UDC: 
530.182
EDN: 

Artificial neural network with dynamic synapse model

Authors: 
Zimin Ilya Anatolevich, Lobachevsky State University of Nizhny Novgorod
Kazantsev Viktor Borisovich, Institute of Applied Physics of the Russian Academy of Sciences
Stasenko Sergey Victorovich, Lobachevsky State University of Nizhny Novgorod
Abstract: 

The purpose of this study is to develop and investigate a new short-term memory model that combines an artificial neural network lacking an intrinsic short-term memory effect with a dynamic synapse model featuring astrocytic modulation.

Methods. The artificial neural network is a classical convolutional neural network, which has no intrinsic short-term memory. In our hybrid model, short-term memory is modeled by the Tsodyks–Markram model, a third-order system of ordinary differential equations. Astrocyte dynamics is described by a mean-field model of gliotransmitter concentration.
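For orientation, the classical three-variable Tsodyks–Markram system [24] can be written as follows (a reference form only; the astrocyte-modulated variant studied in the article extends this system, and its exact modulation term is not reproduced here):

$$
\frac{dx}{dt} = \frac{z}{\tau_{\mathrm{rec}}} - u\,x\,\delta(t - t_{\mathrm{sp}}), \qquad
\frac{dy}{dt} = -\frac{y}{\tau_{\mathrm{in}}} + u\,x\,\delta(t - t_{\mathrm{sp}}), \qquad
\frac{dz}{dt} = \frac{y}{\tau_{\mathrm{in}}} - \frac{z}{\tau_{\mathrm{rec}}},
$$

where $x$, $y$, $z$ are the fractions of synaptic resources in the recovered, active, and inactive states ($x + y + z = 1$), $u$ is the release probability, $\tau_{\mathrm{rec}}$ and $\tau_{\mathrm{in}}$ are the recovery and inactivation time constants, and $t_{\mathrm{sp}}$ denotes presynaptic spike times.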

Results. A new hybrid short-term memory model combining a convolutional neural network with a dynamic synapse model was developed and investigated on an image recognition task. Plots of accuracy and loss versus the number of training epochs are presented. The d-prime sensitivity metric for image recognition was introduced. The developed model was compared with a recurrent neural network and with a configuration of the new model without astrocytic modulation. A comparison table shows that the proposed model achieves the best recognition accuracy.
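For reference, d-prime in the standard signal-detection sense is the difference of the z-transformed hit and false-alarm rates, d' = Z(HR) − Z(FAR). Below is a minimal Python sketch under that standard definition; the article's exact variant may differ, and the clipping constant is an illustrative choice:

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Standard signal-detection d': Z(hit rate) - Z(false-alarm rate)."""
    hr = hits / (hits + misses)
    far = false_alarms / (false_alarms + correct_rejections)
    # Clip rates away from 0 and 1 so the inverse normal CDF stays finite
    # (an illustrative correction; other corrections are also common).
    eps = 1e-6
    hr = min(max(hr, eps), 1 - eps)
    far = min(max(far, eps), 1 - eps)
    return norm.ppf(hr) - norm.ppf(far)

print(d_prime(45, 5, 10, 40))  # 90% hits, 20% false alarms -> ~2.12
```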

Conclusion. The study demonstrates that an artificial neural network can be combined with a dynamic model that extends its functionality. Comparing the proposed short-term memory model, built from a convolutional neural network and a dynamic synapse model with astrocytic modulation, against a recurrent network showed the effectiveness of the proposed approach to simulating short-term memory.
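To illustrate the kind of combination described, here is a minimal, hypothetical PyTorch sketch (not the authors' implementation): a CNN feature extractor whose activations pass through a Tsodyks–Markram-style dynamic synapse layer integrated with forward Euler. Layer sizes, the time constants tau_rec and tau_in, the release probability u, and the step dt are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class DynamicSynapse(nn.Module):
    """Tsodyks-Markram-style short-term plasticity applied to feature activations."""
    def __init__(self, tau_rec=0.8, tau_in=0.05, u=0.5, dt=0.01):
        super().__init__()
        self.tau_rec, self.tau_in, self.u, self.dt = tau_rec, tau_in, u, dt

    def forward(self, feats):
        # feats: (time_steps, batch, features); activations act as presynaptic drive
        x = torch.ones_like(feats[0])   # recovered resources
        y = torch.zeros_like(feats[0])  # active resources (z = 1 - x - y is inactive)
        outputs = []
        for f in feats:
            release = self.u * x * f  # resource release driven by the input
            x = x + self.dt * ((1 - x - y) / self.tau_rec) - self.dt * release
            y = y + self.dt * (-y / self.tau_in) + self.dt * release
            outputs.append(y)
        return torch.stack(outputs)

class HybridNet(nn.Module):
    def __init__(self, n_classes=10):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.synapse = DynamicSynapse()
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, images):
        # images: (time_steps, batch, 3, H, W) -- a short sequence of frames
        feats = torch.stack([self.cnn(img) for img in images])
        mem = self.synapse(feats)
        return self.classifier(mem[-1])  # classify from the final memory state

model = HybridNet()
seq = torch.randn(5, 2, 3, 32, 32)  # 5 time steps, batch of 2, CIFAR-sized images
print(model(seq).shape)  # torch.Size([2, 10])
```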

Acknowledgments: 
The selection of parameters for the three-dimensional dynamic synapse model was supported by the scientific program of the National Center for Physics and Mathematics, section No. 9 “Artificial intelligence and big data in technical, industrial, natural and social systems”; the simulation and training of the model were supported within the framework of the Development Program of the Regional Scientific and Educational Mathematical Center “Mathematics of Future Technologies” (Agreement No. 075-02-2024-1439).
References: 
  1. Baddeley A. Working memory. Current Biology. 2010;20(4):R136–R140. DOI: 10.1016/j.cub.2009.12.014.
  2. Miller GA. The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review. 1956;63(2):81–97. DOI: 10.1037/h0043158.
  3. Cowan N. The magical number 4 in short-term memory: A reconsideration of mental storage capacity. Behavioral and Brain Sciences. 2001;24(1):87–114. DOI: 10.1017/S0140525X01003922.
  4. Wager TD, Smith EE. Neuroimaging studies of working memory: a meta-analysis. Cognitive, Affective, & Behavioral Neuroscience. 2003;3(4):255–274. DOI: 10.3758/cabn.3.4.255.
  5. Engle RW. Working memory capacity as executive attention. Current Directions in Psychological Science. 2002;11(1):19–23.
  6. Park DC, Polk TA, Mikels JA, Taylor SF, Marshuetz C. Cerebral aging: integration of brain and behavioral models of cognitive function. Dialogues in Clinical Neuroscience. 2001;3(3):151–165. DOI: 10.31887/DCNS.2001.3.3/dcpark.
  7. Postle BR. Working memory as an emergent property of the mind and brain. Neuroscience. 2006;139(1):23–38. DOI: 10.1016/j.neuroscience.2005.06.005.
  8. Luck SJ, Vogel EK. The capacity of visual working memory for features and conjunctions. Nature. 1997;390(6657):279–281. DOI: 10.1038/36846.
  9. Hollingworth A, Henderson JM. Accurate visual memory for previously attended objects in natural scenes. Journal of Experimental Psychology: Human Perception and Performance. 2002;28(1):113–136. DOI: 10.1037/0096-1523.28.1.113.
  10. Vogel EK, Woodman GF, Luck SJ. Pushing around the locus of selection: Evidence for the flexible-selection hypothesis. Journal of Cognitive Neuroscience. 2005;17(12):1907–1922. DOI: 10.1162/089892905775008599.
  11. Perea G, Araque A. Astrocytes potentiate transmitter release at single hippocampal synapses. Science. 2007;317(5841):1083–1086. DOI: 10.1126/science.1144640.
  12. Suzuki A, Stern SA, Bozdagi O, Huntley GW, Walker RH, Magistretti PJ, Alberini CM. Astrocyte-neuron lactate transport is required for long-term memory formation. Cell. 2011;144(5):810–823. DOI: 10.1016/j.cell.2011.02.018.
  13. Ango F, Wu C, Van der Want JJ, Wu P, Schachner M, Huang J. Bergmann glia and the recognition molecule CHL1 organize GABAergic axons and direct innervation of Purkinje cell dendrites. PLoS Biology. 2008;6(4):e103. DOI: 10.1371/journal.pbio.0060103.
  14. Hu B, Garrett ME, Groblewski PA, Ollerenshaw DR, Shang J, Roll K, Manavi S, Koch C, Olsen SR, Mihalas S. Adaptation supports short-term memory in a visual change detection task. PLoS Computational Biology. 2021;17(9):e1009246. DOI: 10.1371/journal.pcbi.1009246.
  15. Garrett M, Manavi S, Roll K, Ollerenshaw DR, Groblewski PA, Ponvert ND, Kiggins JT, Casal L, Mace K, Williford A, Leon A, Jia X, Ledochowitsch P, Buice MA, Wakeman W, Mihalas S, Olsen SR. Experience shapes activity dynamics and stimulus coding of VIP inhibitory cells. eLife. 2020;9:e50340. DOI: 10.7554/eLife.50340.
  16. Stasenko SV, Kazantsev VB. Dynamic image representation in a spiking neural network supplied by astrocytes. Mathematics. 2023;11(3):561. DOI: 10.3390/math11030561.
  17. Stasenko S, Kazantsev V. Astrocytes enhance image representation encoded in spiking neural network. In: Kryzhanovsky B, Dunin-Barkowski W, Redko V, Tiumentsev Y, editors. Advances in Neural Computation, Machine Learning, and Cognitive Research VI. NEUROINFORMATICS 2022. Vol. 1064 of Studies in Computational Intelligence. Cham: Springer; 2023. P. 200–206. DOI: 10.1007/978-3-031-19032-2_20.
  18. Lazarevich IA, Stasenko SV, Kazantsev VB. Synaptic multistability and network synchronization induced by the neuron–glial interaction in the brain. JETP Lett. 2017;105(3):210–213. DOI: 10.1134/S0021364017030092.
  19. Stasenko SV, Lazarevich IA, Kazantsev VB. Quasi-synchronous neuronal activity of the network induced by astrocytes. Procedia Computer Science. 2020;169:704–709. DOI: 10.1016/j.procs.2020.02.175.
  20. Barabash N, Levanova T, Stasenko S. STSP model with neuron–glial interaction produced bursting activity. In: 2021 Third International Conference Neurotechnologies and Neurointerfaces (CNN). 13–15 September 2021, Kaliningrad, Russian Federation. IEEE; 2021. P. 12–15. DOI: 10.1109/CNN53494.2021.9580314.
  21. Stasenko S, Kazantsev V. 3D model of bursting activity generation. In: 2022 Fourth International Conference Neurotechnologies and Neurointerfaces (CNN). 14–16 September 2022, Kaliningrad, Russian Federation. IEEE; 2022. P. 176–179. DOI: 10.1109/CNN56452.2022.9912507.
  22. Barabash N, Levanova T, Stasenko S. Rhythmogenesis in the mean field model of the neuron–glial network. Eur. Phys. J. Spec. Top. 2023;232(5):529–534. DOI: 10.1140/epjs/s11734-023-00778-9.
  23. Olenin SM, Levanova TA, Stasenko SV. Dynamics in the reduced mean-field model of neuron–glial interaction. Mathematics. 2023;11(9):2143. DOI: 10.3390/math11092143.
  24. Tsodyks M, Pawelzik K, Markram H. Neural networks with dynamic synapses. Neural Computation. 1998;10(4):821–835. DOI: 10.1162/089976698300017502.
  25. Krizhevsky A. Learning Multiple Layers of Features from Tiny Images. Technical Report TR-2009. Toronto: University of Toronto; 2009. 60 p.
  26. Paszke A, Gross S, Massa F, Lerer A, Bradbury J, Chanan G, Killeen T, Lin Z, Gimelshein N, Antiga L, Desmaison A, Kopf A, Yang E, DeVito Z, Raison M, Tejani A, Chilamkurthy S, Steiner B, Fang L, Bai J, Chintala S. PyTorch: An imperative style, high-performance deep learning library. In: NIPS’19: Proceedings of the 33rd International Conference on Neural Information Processing Systems. No. 721. Vancouver, Canada: NeurIPS; 2019. P. 8026–8037.
Received: 
05.10.2023
Accepted: 
04.12.2023
Available online: 
01.04.2024