ISSN 0869-6632 (Print)
ISSN 2542-1905 (Online)


The article is published as Early Access.

This is an open access article distributed under the terms of Creative Commons Attribution 4.0 International License (CC-BY 4.0).
Language: 
Russian
Article type: 
Article
UDC: 
519.722
EDN: 

Definition of information in computer science

Authors: 
Kuzenkov Oleg Anatolevich, Lobachevsky State University of Nizhny Novgorod
Abstract: 

The purpose of this study is to formulate a working definition of information that meets the needs of computer science. At present there is no rigorous definition of this term, which creates a methodological contradiction: the development and application of information technologies require accuracy and rigor, yet this development rests on a vague, intuitive concept.

Materials and methods. The material for the study is the existing classical approaches to understanding information, and the main method is the analysis of these approaches. The proposed definition is constructed on the basis of two mathematical transformations: the selection of a certain subset and the mapping between sets. The apparatus of fuzzy sets is used to formalize the selection procedure.
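The definition thus rests on two elementary constructions: a mapping between a set of prototypes and a set of images, and the selection of a subset of the prototypes, which can be formalized through a fuzzy membership (presence) function. A minimal Python sketch of these two transformations is given below; the sets, the mapping f, and all membership values are hypothetical illustrations, not data from the article.

```python
# Illustrative sketch of the two transformations behind the proposed definition.
# The sets, the mapping f, and all membership values are hypothetical.

prototypes = {"a", "b", "c", "d"}        # set of prototypes X
f = {"a": 1, "b": 1, "c": 2, "d": 3}     # mapping f: X -> Y
images = set(f.values())                 # set of images Y

# Transformation 1: selection of a (crisp) subset A of the prototypes.
A = {"a", "c"}

# Transformation 2: the mapping carries the selected prototypes to images.
f_A = {f[x] for x in A}                  # corresponding subset of images

# Fuzzy formalization of the selection: a membership ("presence") function
# mu: X -> [0, 1] replaces the crisp subset A (Zadeh, 1965).
mu = {"a": 0.9, "b": 0.1, "c": 0.7, "d": 0.0}

# Membership induced on the images (Zadeh's extension principle:
# supremum of mu over the preimage of each image point).
mu_images = {y: max(mu[x] for x in prototypes if f[x] == y) for y in images}

print(f_A)        # {1, 2}
print(mu_images)  # {1: 0.9, 2: 0.7, 3: 0.0}
```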

Results. A definition of information is proposed as the result of a mapping in which the selection of a subset from a set of prototypes leads to the selection of a corresponding subset from a set of images. The selected subset can also be understood as fuzzy; an equivalent definition then treats information as the result of a mapping in which an increase in the heterogeneity of the distribution of the presence indicator on the set of prototypes leads to an increase in the heterogeneity of the distribution of the corresponding indicator on the set of images. The essence of the new definition is demonstrated using models of population dynamics in discrete time. The significance of the proposed approach for information technology is shown by the example of a numerical method of multi-extremal optimization. In particular, the proposed definition makes it possible to formulate effective stopping conditions for a numerical method of stochastic optimization that guarantee obtaining a given amount of information.
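As a worked illustration of the equivalent, fuzzy formulation: one possible way to quantify the heterogeneity of a presence indicator is one minus its normalized Shannon entropy (zero for a uniform indicator, approaching one as the indicator concentrates on few elements). The measure, the mapping, and the numbers below are assumptions chosen for illustration; the abstract does not fix a particular heterogeneity measure.

```python
import math

def heterogeneity(weights):
    """1 - normalized Shannon entropy: 0 for a uniform distribution,
    tending to 1 as the distribution concentrates on few elements."""
    total = sum(weights)
    p = [w / total for w in weights if w > 0]
    if len(p) <= 1:
        return 1.0
    h = -sum(q * math.log(q) for q in p)
    return 1.0 - h / math.log(len(weights))

def push_forward(mu, f, n_images):
    """Presence indicator induced on the images (sum over each preimage)."""
    nu = [0.0] * n_images
    for i, y in enumerate(f):
        nu[y] += mu[i]
    return nu

# Hypothetical mapping from 4 prototypes onto 3 images.
f = [0, 0, 1, 2]                    # f[i] = index of the image of prototype i
mu_uniform  = [1.0, 1.0, 1.0, 1.0]  # no selection: uniform presence indicator
mu_selected = [0.9, 0.1, 0.7, 0.0]  # selection concentrates the indicator

for mu in (mu_uniform, mu_selected):
    nu = push_forward(mu, f, 3)
    print(round(heterogeneity(mu), 3), round(heterogeneity(nu), 3))
# Output (approximately): 0.0 0.054, then 0.373 0.383 -- the increase in
# heterogeneity on the prototypes is mirrored on the images, which is the
# proposed criterion for saying that the mapping transmits information.
```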

Conclusion. The proposed understanding of information overcomes the shortcomings of previous approaches to the essence of information, retains all the advantages of the classical approach, and is consistent with other well-known approaches in computer science. This definition can be used to improve numerical optimization methods as well as other information technology tools.

References: 
  1. Kosta A, Pappas N, Angelakis V. Age of Information: A New Concept, Metric, and Tool. Foundations and Trends in Networking. 2017;12(3):162–259. DOI: 10.1561/1300000060.
  2. Maatouk A, Kriouile S, Assaad M, Ephremides A. The Age of Incorrect Information: A New Performance Metric for Status Updates. IEEE/ACM Transactions on Networking. 2020;28(5):2215–2228. DOI: 10.1109/TNET.2020.3005549.
  3. Leinster T. Entropy and diversity: the axiomatic approach. New York: Cambridge University Press; 2021. 442 p. DOI: 10.48550/arXiv.2012.02113.
  4. Mazur M. Qualitative information theory. Moscow: Mir; 1974. 238 p. (in Russian). (Original: Jakościowa teoria informacji. Warszawa: Wydawnictwa Naukowo-Techniczne; 1970.)
  5. Kolmogorov AN. Combinatorial foundations of information theory and the calculus of probabilities. Russian Mathematical Surveys. 1983;38(4):29–40. DOI: 10.1070/rm1983v038n04abeh004203.
  6. Chernavsky DS. Synergetics and information. Moscow: Nauka; 2001. 304 p. (in Russian).
  7. Bates MJ. Concepts for the Study of Information Embodiment. Library Trends. 2018;66(3):239–266. DOI: 10.1353/lib.2018.0002.
  8. Adriaans P. A Critical Analysis of Floridi’s Theory of Semantic Information. Knowledge, Technology & Policy. 2010;23:41–56. DOI: 10.1007/s12130-010-9097-5.
  9. Floridi L. What is the philosophy of information? Metaphilosophy. 2002;33(1–2):123–145. DOI: 10.1111/1467-9973.00221.
  10. Gang L. Philosophy of information and the foundations of the future Chinese philosophy of science and technology. Questions of philosophy. 2007;5:45–57 (in Russian).
  11. Adriaans P, van Benthem J. Philosophy of information (Handbook of the philosophy of science). North Holland, 2008. 1000 p.
  12. Colin KK. Philosophy of information: the structure of reality and the phenomenon of information. Metaphysics. 2013;4(10):61–84 (in Russian).
  13. Sequoiah-Grayson S. The Metaphilosophy of Information. Minds and Machines. 2007;17:331–344. DOI: 10.1007/s11023-007-9072-4.
  14. Mingers J, Standing C. What is information? Toward a theory of information as objective and veridical. Journal of Information Technology. 2018;33(2):85–104. DOI: 10.1057/s41265-017-0038-6.
  15. Diaz Nafria J. What is information? A Multidimensional Concern. TripleC. 2010;8(1):77–108. DOI: 10.31269/triplec.v8i1.76.
  16. Crnkovic G, Hofkirchner W. Floridi’s “Open Problems in Philosophy of Information”, Ten Years Later. Information. 2011;2(2):327–359. DOI: 10.3390/info2020327.
  17. Robinson L, Bawden D. Mind the Gap: Transitions between concepts of information in varied domains. Theories of Information, Communication and Knowledge. 2014;34:121–141. DOI: 10.1007/978-94-007-6973-1_6.
  18. Lektorsky VA, Pruzhinin BI, Bodyakin VI, Dubrovsky DI, Kolin KK, Melik-Gaikazyan IV, Ursul AD. The information approach in interdisciplinary prospect (a round-table discussion). Russian studies in philosophy. 2010;2:84–122 (in Russian).
  19. Zins C. Conceptual Approaches to Defining Data, Information and Knowledge. Journal of the American Society for Information Science and Technology. 2007;58(4):479–493. DOI: 10.1002/asi.20508.
  20. Liew A. Understanding Data, Information, Knowledge And Their Inter-Relationships. Journal of Knowledge Management Practice. 2007;8(2).
  21. Capurro R, Hjorland B. The Concept of Information. Annual Review of Information Science and Technology. 2003;37(1):343–411. DOI: 10.1002/aris.1440370109.
  22. Beynon-Davies P. Significance: Exploring the nature of information, systems and technology. London: Palgrave Macmillan, 2010. 355 p.
  23. Callaos N, Callaos B. Toward a Systemic Notion of Information: Practical Consequences. Informing Science. 2002;5(1):1–11. DOI: 10.28945/532.
  24. Vigo R. Representational information: a new general notion and measure of information. Information Sciences. 2011;181(21):4847–4859. DOI: 10.1016/j.ins.2011.05.020.
  25. Deutsch D, Marletto C. Constructor theory of information. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences. 2015;471(2174):20140540. DOI: 10.1098/rspa.2014.0540.
  26. Dittrich T. “The concept of information in physics”: an interdisciplinary topical lecture. European Journal of Physics. 2015;36(1):015010. DOI: 10.1088/0143-0807/36/1/015010.
  27. Clifton R, Bub J, Halvorson H. Characterizing quantum theory in terms of information-theoretic constraints. Foundations of Physics. 2003;33:1561–1591. DOI: 10.1023/A:1026056716397.
  28. Morrison ML, Rosenberg NA. Mathematical bounds on Shannon entropy given the abundance of the ith most abundant taxon. Journal of Mathematical Biology. 2023;87:76. DOI: 10.1007/s00285-023-01997-3.
  29. Cushman SA. Entropy in landscape ecology: a quantitative textual multivariate review. Entropy. 2021;23(11):1425. DOI: 10.3390/e23111425.
  30. Belyaev MA, Malinina LA, Lysenko VV. Fundamentals of Computer Science: Textbook for Universities. Moscow: Phoenix; 2006. 352 p. (in Russian).
  31. Simonovich SV. Informatics. Basic Course: Textbook for Universities. 3rd ed. Third Generation Standard. St. Petersburg: Piter; 2011. 640 p. (in Russian).
  32. Makarova NV, Volkov VB. Computer Science: Textbook for Universities. St. Petersburg: Piter; 2011. 576 p. (in Russian).
  33. Nielsen M, Chuang I. Quantum Computation and Quantum Information. Cambridge: Cambridge University Press, 2000. 702 p. DOI: 10.1017/CBO9780511976667.
  34. Zadeh LA. Outline of a new approach to the analysis of complex systems and decision processes. IEEE Transactions on Systems, Man and Cybernetics. 1973;SMC-3(1):28–44. DOI: 10.1109/TSMC.1973.5408575.
  35. Zadeh LA. Fuzzy sets. Information and Control. 1965;8(3):338–353. DOI: 10.1016/S0019-9958(65)90241-X.
  36. Kuzenkov O, Morozov A. Towards the Construction of a Mathematically Rigorous Framework for the Modelling of Evolutionary Fitness. Bulletin of Mathematical Biology. 2019;81(11):4675–4700. DOI: 10.1007/s11538-019-00602-3.
  37. Perfil’eva IG. Applications of the theory of fuzzy sets. Journal of Soviet Mathematics. 1992;58(2):148–194. DOI: 10.1007/BF01097427.
  38. Kuzenkov O, Ryabova E. Variational principle for self-replicating systems. Mathematical Modelling of Natural Phenomena. 2015;10(2):115–128. DOI: 10.1051/mmnp/201510208.
  39. Kuzenkov OA, Novozhenin AV. Optimal control of measure dynamics. Communications in Nonlinear Science and Numerical Simulation. 2015;21(1-3):159–171. DOI: 10.1016/j.cnsns.2014.08.024.
  40. Sandhu S, Morozov A, Kuzenkov O. Revealing evolutionarily optimal strategies in self-reproducing systems via a new computational approach. Bulletin of Mathematical Biology. 2019;81(11):4701–4725. DOI: 10.1007/s11538-019-00663-4.
  41. Muller I. A History of Thermodynamics: The Doctrine of Energy and Entropy. Berlin: Springer, 2007. 330 p. DOI: 10.1007/978-3-540-46227-9.
  42. Shu JJ. A new integrated symmetrical table for genetic codes. Biosystems. 2017;151:21–26. DOI: 10.1016/j.biosystems.2016.11.004.
  43. Morozov AY, Kuzenkov OA, Sandhu SK. Global optimisation in Hilbert spaces using the survival of the fittest algorithm. Communications in Nonlinear Science and Numerical Simulation. 2021;103:106007. DOI: 10.1016/j.cnsns.2021.106007.
  44. Yao X, Liu Y, Lin G. Evolutionary programming made faster. IEEE Transactions on Evolutionary Computation. 1999;3(2):82–102. DOI: 10.1109/4235.771163.
  45. Kuzenkov O, Morozov A, Kuzenkova G. Recognition of patterns of optimal diel vertical migration of zooplankton using neural networks. In: 2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary. 2019. P. 1–6. DOI: 10.1109/IJCNN.2019.8852060.
  46. Kuzenkov O, Kuzenkova G. Identification of the fitness function using neural networks. Procedia Computer Science. 2020;169:692–697. DOI: 10.1016/j.procs.2020.02.179.
  47. Casagrande D. Information as verb: Re-conceptualizing information for cognitive and ecological models. Journal of Ecological Anthropology. 1999;3(1):4–13. DOI: 10.5038/2162-4593.3.1.1.
  48. Dusenbery DB. Sensory Ecology. New York: Freeman, 1992. 558 p.
  49. Zadeh LA. The concept of a linguistic variable and its application to approximate reasoning—I. Information Sciences. 1975;8(3):199–249. DOI: 10.1016/0020-0255(75)90036-5.
  50. Kuzenkov O, Kuzenkova G, Kiseleva T. Computer support of training and research projects in the field of mathematical modeling of selection processes. Educational technologies and society. 2019;22(1):152–163 (in Russian).
  51. Kuzenkov OA. Studying the Concept of Information by IT-students. Modern Information Technologies and IT-Education. 2023;19(1):13–23 (in Russian). DOI: 10.25559/SITITO.019.202301.013-023.
Received: 
03.12.2023
Accepted: 
11.03.2024
Available online: 
27.05.2024