ISSN 0869-6632 (Print)
ISSN 2542-1905 (Online)


For citation:

Sutyagin A. A., Kanakov O. I. Learning mechanism for a collective classifier based on competition driven by training examples. Izvestiya VUZ. Applied Nonlinear Dynamics, 2024, vol. 32, iss. 2, pp. 160-179. DOI: 10.18500/0869-6632-003089, EDN: WBNLWM

This is an open access article distributed under the terms of Creative Commons Attribution 4.0 International License (CC-BY 4.0).
Language: 
Russian
Article type: 
Article
UDC: 
530.182
EDN: 
WBNLWM
Learning mechanism for a collective classifier based on competition driven by training examples

Authors: 
Sutyagin A. A., Lobachevsky State University of Nizhny Novgorod
Kanakov Oleg Igorevich, Lobachevsky State University of Nizhny Novgorod
Abstract: 

The purpose of this work is to modify the learning mechanism of a collective classifier so that learning is provided by population dynamics alone, without an external sorting device. A collective classifier is an ensemble of non-identical simple elements that have neither intrinsic dynamics nor adjustable parameters; the classifier learns by adjusting the composition of the ensemble, which in the preceding literature was achieved by selecting ensemble elements with a sorting device.
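For illustration, the following minimal Python sketch shows the sense in which such an ensemble classifies; the logistic element response, the two-class setting, and all numerical values are assumptions made for this sketch and are not taken from the paper. Each element produces a fixed response to an input, the ensemble output is the sum of the element responses, and learning can only change how many elements of each type the ensemble contains.

import numpy as np

# Hypothetical element response: each element type has a fixed response curve
# with its own, non-adjustable threshold parameter (logistic form assumed here).
def response(x, threshold, steepness=5.0):
    return 1.0 / (1.0 + np.exp(-steepness * (x - threshold)))

thresholds = np.linspace(0.0, 1.0, 20)   # 20 non-identical element types
counts = np.ones_like(thresholds)        # ensemble composition: the only learnable quantity

def ensemble_output(x):
    # Total response of the ensemble: element responses weighted by the
    # number of elements of each type present in the population.
    return np.dot(counts, response(x, thresholds))

def classify(x, decision_threshold):
    # Decision rule: compare the summed ensemble response to a threshold.
    return 1 if ensemble_output(x) > decision_threshold else 0

print(ensemble_output(0.7), classify(0.7, decision_threshold=10.0))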

Methods. The population dynamics model of a collective classifier is extended with a “learning subsystem”, which is controlled by a sequence of training examples and, in turn, controls the strength of intraspecific competition in the population dynamics. The dynamics of the learning subsystem is reduced to a linear map with random parameters expressed via the training examples. Its solution is an asymptotically stationary Markovian random process, for which we analytically find the asymptotic expectation and show that its variance vanishes in the limit under the specified assumptions, which allows an approximate deterministic description of the coupled population dynamics based on results available in the preceding literature.
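The abstract does not give the explicit form of the learning subsystem; as a loose illustration of the kind of object described (a linear map with random parameters expressed via training examples, whose solution is an asymptotically stationary Markov process), one may consider a scalar affine recursion driven by i.i.d. training examples. The coefficient functions a(.) and b(.) below are hypothetical placeholders, not the model of the paper.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training stream: i.i.d. examples x_n, here uniform on [0, 1].
def next_example():
    return rng.uniform(0.0, 1.0)

# Linear (affine) map with random parameters expressed via the training example:
#   q_{n+1} = a(x_n) * q_n + b(x_n).
# Since q_n depends only on earlier examples and is independent of x_n, the mean
# obeys m_{n+1} = E[a] * m_n + E[b]; with |E[a]| < 1 it converges to E[b] / (1 - E[a]).
def a(x):
    return 0.85 + 0.1 * x   # E[a] = 0.9 (contraction on average)

def b(x):
    return x                # E[b] = 0.5

q = 0.0
trajectory = []
for n in range(20000):
    x = next_example()
    q = a(x) * q + b(x)
    trajectory.append(q)

print("empirical long-run mean:", np.mean(trajectory[5000:]))
print("predicted stationary mean:", 0.5 / (1.0 - 0.9))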

Results. We show analytically, and illustrate by numerical simulation, that in the course of learning the decision rule of our classifier converges to the Bayesian rule under assumptions essentially in line with the available literature on collective classifiers. The required competitive dynamics is implemented without an external sorting device.
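For reference, the Bayesian (minimum-error) decision rule to which the classifier is stated to converge assigns an input x to the class with the largest posterior probability:

\[
  \hat{k}(x) = \arg\max_{k} P(C_k \mid x) = \arg\max_{k} \, p(x \mid C_k)\, P(C_k),
\]

where \(p(x \mid C_k)\) is the class-conditional density and \(P(C_k)\) the prior probability of class \(C_k\).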

Conclusion. We propose a conceptual model of a collective classifier whose learning is fully provided by its own population dynamics. We expect that, similarly to the approaches taken in the preceding literature, our classifier can be implemented as an ensemble of living cells equipped with synthetic genetic circuits, once a mechanism of population dynamics with synthetically controlled intraspecific competition becomes available.

Acknowledgments: 
This work was supported by the Ministry of Science and Higher Education of the Russian Federation (project No. FSWR-2023-0031).
References: 
  1. Aivazyan SA, Buchstaber VM, Yenyukov IS, Meshalkin LD. Applied Statistics: Classification and Reduction of Dimensionality. Moscow: Finansy i Statistika; 1989. 608 p. (in Russian).
  2. Alpaydin E. Introduction to Machine Learning. Fourth Edition. Cambridge, Massachusetts: MIT Press; 2020. 683 p.
  3. Sutyagin AA, Kanakov OI. Collective classifier learning strategy based upon competition in the coexistence regime. Izvestiya VUZ. Applied Nonlinear Dynamics. 2021;29(2):220–239 (in Russian). DOI: 10.18500/0869-6632-2021-29-2-220-239.
  4. Didovyk A, Kanakov OI, Ivanchenko MV, Hasty J, Huerta R, Tsimring L. Distributed classifier based on genetically engineered bacterial cell cultures. ACS Synthetic Biology. 2015;4(1):72–82. DOI: 10.1021/sb500235p.
  5. Kanakov O, Kotelnikov R, Alsaedi A, Tsimring L, Huerta R, Zaikin A, Ivanchenko M. Multi-input distributed classifiers for synthetic genetic circuits. PLoS ONE. 2015;10(5):e0125144. DOI: 10.1371/journal.pone.0125144.
  6. Goh BS. Global stability in many-species systems. The American Naturalist. 1977;111(977): 135–143. DOI: 10.1086/283144.
  7. Gnedenko BV. A Course in Probability Theory. Moscow: LENAND; 2022. 456 p. (in Russian).
Received: 
21.10.2023
Accepted: 
30.11.2023
Available online: 
12.01.2024
Published: 
29.03.2024