For citation:
Semenova N. I. The impact of internal noise on the performance of convolutional neural network. Izvestiya VUZ. Applied Nonlinear Dynamics, 2025, vol. 33, iss. 6, pp. 898-916. DOI: 10.18500/0869-6632-003177, EDN: VYXWAN
The impact of internal noise on the performance of convolutional neural network
Purpose. This study aims to establish the characteristics of noise propagation and accumulation in convolutional neural networks. The article investigates how the accuracy of a trained convolutional network varies depending on the type and intensity of noise exposure.
Methods. White Gaussian noise sources were used as the basis for noise exposure. Two types of noise exposure were applied to artificial neurons: additive and multiplicative. Additionally, the effects of correlated and uncorrelated noise on the layers of neurons were examined.
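The two noise models described in the Methods can be sketched as follows. This is a minimal NumPy illustration, not the author's code: the function name `apply_noise`, the intensity parameter `sigma`, and the layer output `y` are assumptions made for the example; "correlated" here means one shared noise sample per layer, "uncorrelated" means an independent sample per neuron, as described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_noise(y, sigma, kind="additive", correlated=False):
    """Inject white Gaussian noise of intensity sigma into a layer output y.

    additive:       y + sigma * xi
    multiplicative: y * (1 + sigma * xi)
    correlated:   a single noise sample xi shared by all neurons in the layer
    uncorrelated: an independent sample xi for each neuron
    """
    if correlated:
        xi = rng.standard_normal()          # one value for the whole layer
    else:
        xi = rng.standard_normal(y.shape)   # independent value per neuron
    if kind == "additive":
        return y + sigma * xi
    if kind == "multiplicative":
        return y * (1.0 + sigma * xi)
    raise ValueError(f"unknown noise kind: {kind}")

# Example: noisy output of a 5-neuron layer
y = np.ones(5)
print(apply_noise(y, 0.1, kind="additive", correlated=False))
```

Note that multiplicative noise vanishes wherever the layer output is zero, while additive noise perturbs every neuron regardless of its output; this asymmetry underlies the different accumulation behavior reported in the Results.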
Results. The findings indicate that additive noise (both correlated and uncorrelated) accumulates more significantly in networks with convolutional layers compared to those without. The relationship between network accuracy and the intensity of multiplicative correlated noise is similar for both types of networks. However, the impact of multiplicative uncorrelated noise is more favorable for networks with convolutional layers. The study also considered pooling layers, specifically MaxPooling and MeanPooling, which significantly enhance accuracy in the presence of additive noise within the convolutional layer. The decline in accuracy due to increasing intensity of multiplicative correlated noise is nearly identical for networks with and without pooling layers. Conversely, networks employing MaxPooling demonstrate reduced resilience to uncorrelated multiplicative noise.
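Why a pooling layer can mitigate additive noise in the preceding convolutional layer is easy to see in a toy example. The sketch below is illustrative only and is not taken from the paper: the 4×4 feature map, the 2×2 window, and the helper `pool2x2` are assumptions made for the example. Averaging a 2×2 window reduces the variance of uncorrelated additive noise by a factor of 4, whereas taking the maximum is biased toward the largest noise excursion in each window.

```python
import numpy as np

rng = np.random.default_rng(1)

# A constant 4x4 feature map corrupted by uncorrelated additive noise.
clean = np.full((4, 4), 1.0)
noisy = clean + 0.2 * rng.standard_normal(clean.shape)

def pool2x2(x, op):
    """Apply a 2x2 pooling operation (np.max or np.mean) with stride 2."""
    h, w = x.shape
    blocks = x.reshape(h // 2, 2, w // 2, 2)
    return op(blocks, axis=(1, 3))

mean_pooled = pool2x2(noisy, np.mean)  # MeanPooling: averages noise out
max_pooled = pool2x2(noisy, np.max)    # MaxPooling: keeps per-window maxima

# Mean deviation of each pooled map from the clean value 1.0
print(np.abs(mean_pooled - 1.0).mean(), np.abs(max_pooled - 1.0).mean())
```

The same bias that makes MaxPooling latch onto noise maxima is consistent with the reduced resilience to uncorrelated multiplicative noise reported for MaxPooling networks above.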
Conclusion. The study demonstrates that additive noise severely degrades network performance when a convolutional layer is present, though this negative effect can be mitigated by including a pooling layer immediately following the convolutional layer. In contrast, the effects of multiplicative noise are less clear-cut. In most cases, its impact remains consistent regardless of the presence of convolution and pooling layers. However, the use of MaxPooling in the pooling layer may compromise the network’s robustness against multiplicative uncorrelated noise.
- LeCun Y, Bengio Y, Hinton G. Deep learning. Nature. 2015;521(7553):436-444. DOI: 10.1038/nature14539.
- Krizhevsky A, Sutskever I, Hinton G E. ImageNet classification with deep convolutional neural networks. Commun. ACM. 2017;60(6):84-90 DOI: 10.1145/3065386.
- Maturana D, Scherer S. VoxNet: A 3D Convolutional Neural Network for real-time object recognition. In: 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 2015, Hamburg, Germany. 2015. P. 922-928 DOI: 10.1109/IROS.2015.7353481.
- Graves A, Mohamed A, Hinton G. Speech recognition with deep recurrent neural networks. In: 2013 IEEE International Conference on Acoustics, Speech and Signal Processing. 2013, Vancouver, BC, Canada. 2013. P. 6645-6649 DOI: 10.1109/ICASSP.2013.6638947.
- Kar S, Moura JMF. Distributed consensus algorithms in sensor networks with imperfect communication: Link failures and channel noise. IEEE Transactions on Signal Processing. 2009;57(1):355-369 DOI: 10.1109/TSP.2008.2007111.
- AI and compute. [Electronic resource]. 2018. Available from: https://openai.com/index/ai-and-compute/.
- Hasler J, Marr H B. Finding a roadmap to achieve large neuromorphic hardware systems. Front. Neurosci. 2013;7:118 DOI: 10.3389/fnins.2013.00118.
- Gupta S, Agrawal A, Gopalakrishnan K, Narayanan P. Deep Learning with Limited Numerical Precision. In: Proceedings of the 32nd International Conference on International Conference on Machine Learning. 2015. Vol. 37. P. 1737-1746 DOI: 10.1109/72.80206.
- Karniadakis GE, Kevrekidis IG, Lu L, Perdikaris P, Wang S, Yang L. Physics-informed machine learning. Nat. Rev. Phys. 2021;3:422-440 DOI: 10.1038/s42254-021-00314-5.
- Aguirre F, Sebastian A, Le Gallo M, Song W, Wang T, Yang JJ, Lu W, Chang M-F, Ielmini D, Yang Y, Mehonic A, Kenyon A, Villena MA, Roldán JB, Wu Y, Hsu Hu-H, Raghavan N, Suñé J, Miranda E, Eltawil A, Setti G, Smagulova K, Salama KN, Krestinskaya O, Yan X, Ang K-W, Jain S, Li S, Alharbi O, Pazos S, Lanza M. Hardware implementation of memristor-based artificial neural networks. Nat. Commun. 2024;15:1974 DOI: 10.1038/s41467-024-45670-9.
- Chen Y, Nazhamaiti M, Xu H, Meng Y, Zhou T, Li G, Fan J, Wei Q, Wu J, Qiao F, Fang L, Dai Q. All-analog photoelectronic chip for high-speed vision tasks. Nature. 2023;623:48-57 DOI: 10.1038/s41586-023-06558-8.
- Brunner D, Soriano MC, Mirasso CR, Fischer I. Parallel photonic information processing at gigabyte per second data rates using transient states. Nat. Commun. 2013;4:1364. DOI: 10.1038/ncomms2368.
- Tuma T, Pantazi A, Le Gallo M, Sebastian A, Eleftheriou E. Stochastic phase-change neurons. Nature Nanotech. 2016;11:693-699 DOI: 10.1038/nnano.2016.70.
- Torrejon J, Riou M, Araujo F, Tsunegi S, Khalsa G, Querlioz D, Bortolotti P, Cros V, Yakushiji K, Fukushima A, Kubota H, Yuasa Sh, Stiles MD, Grollier J. Neuromorphic computing with nanoscale spintronic oscillators. Nature. 2017;547:428-431. DOI: 10.1038/nature23011.
- Psaltis D, Brady D, Gu XG, Lin S. Holography in artificial neural networks. Nature. 1990;343:325-330 DOI: 10.1038/343325a0.
- Bueno J, Maktoobi S, Froehly L, Fischer I, Jacquot M, Larger L, Brunner D. Reinforcement learning in a large-scale photonic recurrent neural network. Optica. 2018;5(6):756-760 DOI: 10.1364/OPTICA.5.000756.
- Lin X, Rivenson Y, Yardimci NT, Veli M, Jarrahi M, Ozcan A. All-optical machine learning using diffractive deep neural networks. Science. 2018;361:1004-1008 DOI: 10.1126/science.aat8084.
- Shen Y, Harris NC, Skirlo S, Prabhu M, Baehr-Jones T, Hochberg M, Sun X, Zhao S, Larochelle H, Englund D, Soljacic M. Deep learning with coherent nanophotonic circuits. Nature Photonics. 2017;11:441-446 DOI: 10.1038/nphoton.2017.93.
- Tait AN, De Lima TF, Zhou E, Wu AX, Nahmias MA, Shastri BJ, Prucnal PR. Neuromorphic photonic networks using silicon photonic weight banks. Sci. Rep. 2017;7(1):7430. DOI: 10.1038/s41598-017-07754-z.
- Moughames J, Porte X, Thiel M, Ulliac G, Larger L, Jacquot M, Kadic M, Brunner D. Three-dimensional waveguide interconnects for scalable integration of photonic neural networks. Optica. 2020;7(6):640-646 DOI: 10.1364/OPTICA.388205.
- Dinc NU, Psaltis D, Brunner D. Optical neural networks: The 3D connection. Photoniques. 2020;104:34-38 DOI: 10.1051/photon/202010434.
- Moughames J, Porte X, Larger L, Jacquot M, Kadic M, Brunner D. 3D printed multimode-splitters for photonic interconnects. Optical Materials Express. 2020;10(11):2952-2961. DOI: 10.1364/OME.402974.
- Semenova N, Larger L, Brunner D. Understanding and mitigating noise in trained deep neural networks. Neural Netw. 2022;146:151-160 DOI: 10.1016/j.neunet.2021.11.008.
- Semenova N. Impact of white Gaussian internal noise on analog echo-state neural networks. arXiv:2405.07670. arXiv Preprint; 2024. 10 p DOI: 10.48550/arXiv.2405.07670.
- Semenova N, Brunner D. Noise-mitigation strategies in physical feedforward neural networks. Chaos. 2022;32(6):061106 DOI: 10.1063/5.0096637.
- Semenova N, Brunner D. Impact of white noise in artificial neural networks trained for classification: Performance and noise mitigation strategies. Chaos. 2024;34(5):051101 DOI: 10.1063/5.0206807.
- Semenova N, Porte X, Andreoli L, Jacquot M, Larger L, Brunner D. Fundamental aspects of noise in analog-hardware neural networks. Chaos. 2019;29(10):103128 DOI: 10.1063/1.5120824.
- Li Z, Liu F, Yang W, Peng S, Zhou J. A survey of convolutional neural networks: Analysis, applications, and prospects. IEEE Transactions on Neural Networks and Learning Systems. 2022;33(12):6999-7019 DOI: 10.1109/TNNLS.2021.3084827.
- LeCun Y. The MNIST database of handwritten digits. [Electronic resource]. 1998. Available from: http://yann.lecun.com/exdb/mnist/.
- Goodfellow I, Bengio Y, Courville A. Deep Learning. MIT Press; 2016. 800 p.
- Chollet F et al. Keras. [Electronic resource]. 2015. Available from: https://github.com/fchollet/keras.
- Stephanie MV, Pham L, Schindler A, Grasser T, Waltl M, Schrenk B. Photonic neuron with on frequency-domain ReLU activation function. Journal of Lightwave Technology. 2024;42(22):7919-7928 DOI: 10.1109/JLT.2024.3413976.
- Li GHY, Sekine R, Nehra R, Gray RM, Ledezma L, Guo Q, Marandi A. All-optical ultrafast ReLU function for energy-efficient nanophotonic deep learning. Nanophotonics. 2023;12(5):847-855 DOI: 10.1515/nanoph-2022-0137.