Journal Neurocomputers № 2, 2023
Article in issue:
Automated neural network system for analyzing the emotional state of a human operator
Type of article: scientific article
DOI: https://doi.org/10.18127/j19998554-202302-04
UDC: 004.89
Authors:

B.K. Abdulaev1, A.I. Vlasov2, T.M. Fatkhutdinov3

1–3 Bauman Moscow State Technical University (Moscow, Russia)

Abstract:

Problem setting. Emotional manifestations play an important role in the social model of an individual's life. Analyzing a person's emotional state makes it possible to track changes in behavior and to assess their attitude toward ongoing events. Human emotions affect cognitive processes and the quality of decisions made. Modern emotion recognition technologies are used to solve various technical and social tasks: for example, they make it possible to automate quality monitoring of customer service in call centers, to evaluate the reaction of a human operator to external factors, etc. Since automation of the detection of emotional reactions is becoming increasingly practical, there is a need for an integrated approach to recognizing the emotional state of a human operator based on the results of a multi-criteria analysis.

Objective. To develop an integrated approach to recognizing the emotional state of a human operator based on the results of a multi-criteria analysis.

Results. The most common tools for recognizing a person's emotional state are analyzed: FaceReader (Noldus), EmoDetect, Face Security, Microsoft Project Oxford Emotion Recognition, and eMotion Software. The methods used to recognize a person's emotional state from facial expression are surveyed, including principal component analysis, the Viola-Jones method, template matching, the Hopfield neural network, a method based on localization of key facial points, and a method based on texture information. A solution for multi-criteria recognition of the emotional state of a human operator based on neural network algorithms and deep learning is proposed. The prospects of various implementation variants of emotion analyzers are shown.
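Of the surveyed face-analysis methods, the Viola-Jones approach is the most widely deployed; its core mechanism is an attentional cascade of boosted weak threshold classifiers that cheaply rejects most non-face windows at the earliest stages. A minimal sketch of that mechanism (the feature values, thresholds, and weights below are hypothetical toy numbers, not a trained detector):

```python
# Minimal sketch of the Viola-Jones attentional cascade idea.
# Each weak classifier votes on one Haar-like feature value; a stage
# sums the weighted votes; a window must pass every stage in order.

def stage_passes(features, weak_classifiers, stage_threshold):
    """A stage sums weighted votes of weak single-feature classifiers."""
    score = 0.0
    for idx, threshold, polarity, weight in weak_classifiers:
        vote = 1 if polarity * features[idx] < polarity * threshold else 0
        score += weight * vote
    return score >= stage_threshold

def cascade_detect(features, stages):
    """A window is accepted only if every stage passes; early stages
    reject most negative windows before later, costlier stages run."""
    return all(stage_passes(features, weak, th) for weak, th in stages)

# Toy cascade: two stages over three hypothetical feature values.
stages = [
    ([(0, 0.5, 1, 1.0)], 0.5),                     # stage 1: one weak classifier
    ([(1, 0.3, 1, 0.6), (2, 0.7, -1, 0.4)], 0.5),  # stage 2: two weak classifiers
]

print(cascade_detect([0.2, 0.1, 0.9], stages))  # accepted: passes both stages
print(cascade_detect([0.9, 0.1, 0.9], stages))  # rejected early, at stage 1
```

In a real detector each stage is trained by AdaBoost over integral-image Haar features, and the cascade is scanned over the image at multiple window scales; emotion classifiers then operate on the face regions that survive the cascade.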

Practical significance. The developed integrated approach can be used to build various digital applications in information support systems, ranging from analysis of the psychophysiological and emotional state of a human operator (pilot, driver, etc.) to multimedia mobile applications that analyze the emotional state of an interlocutor. The trend toward remote work and self-employment makes the approach relevant to human resource management in companies and organizations; it can also be useful in online psychological consulting and interviewing.

Pages: 41-57
For citation

Abdulaev B.K., Vlasov A.I., Fatkhutdinov T.M. Automated neural network system for analyzing the emotional state of a human operator. Neurocomputers. 2023. V. 25. № 2. P. 41–57. DOI: https://doi.org/10.18127/j19998554-202302-04 (In Russian)

References
  1. Shakhnov V.A., Kurnosenko A.E. Modeling of digital production of electronic equipment within the framework of the concept "Industry 4.0". Materials of the I International Scientific and Practical Conference "Digital transformation of industry: trends, management, strategies". 2019. P.585–594. (In Russian)
  2. Burkov V.N., Novikov D.A. Theory of active systems (History of development and modern state). Control problems. 2009. № 3.1. P. 28–35. (In Russian)
  3. Vlasov A.I. Current state and trends in the development of the theory and practice of active suppression of wave fields. Instruments and control systems. 1997. № 12. P. 59–70. (In Russian)
  4. Proceedings of an international scientific and practical conference in two volumes "Theory of active systems". General edition V.N. Burkov, D.A. Novikov. M.: IPU RAS. 2001. V. 1. 182 p. (In Russian)
  5. Vlasov A.I. Visual modeling of complex systems, taking into account the role of active components. AIP Conference Proceedings. 2023.
  6. Vasilova E.V., Vlasov A.I., Evdokimov G.M. Nonverbal communication of the animal world: system analysis of sign languages. International Research Journal. 2017. № 5-3 (59). P. 14–23. (In Russian)
  7. Vasilova E.V., Vlasov A.I., Evdokimov G.M. Nonverbal communications of the animal world: mapping elements of sign languages. International Research Journal. 2017. № 6-3 (60). P. 102–110. (In Russian)
  8. Orekhov A.N. Solving non-standard problems with a computer system. Neurocomputers: development, application. 2021. V. 23. № 3. P. 43–62. DOI: https://doi.org/10.18127/j19998554-202103-05. (In Russian)
  9. Vlasov A.I., Larionov I.T., Orekhov A.N., Tetik L.V. System of automatic analysis of methods for recognizing the emotional state of a person. Neurocomputers: development, application. 2021. V. 23. № 5. P. 33–50. DOI: https://doi.org/10.18127/j19998554-202105-03. (In Russian)
  10. Vlasov A.I., Konkova A.F. Medical and diagnostic expert systems for assessing the adequacy of the body's adaptive response to the effects of extreme factors. Conversion. 1995. № 9-10. P. 18–21. (In Russian)
  11. Ekman P. Psychology of lies. St. Petersburg: Peter. 2000. 272 p. (In Russian)
  12. Mitina G.V., Nugaeva A.N., Shurukhina G.A. Psychology of emotions and motivation. Ufa: BSPU, 2020. 110 p. (In Russian)
  13. Plumper J. History of Emotions. M.: New Literary Review. 2018. 568 p. (In Russian)
  14. Leontiev V.O. Classification of emotions. Odessa: Innovation and Mortgage Center. 2002. 84 p.
  15. Krivonos Yu.G. Modeling and analysis of mimic manifestations of emotions. Dopovidi NASU. 2008. № 12. P. 51–55.
  16. Minenko A.S., Vanzha T.V. System for recognizing the emotional state of a person. Problems of artificial intelligence. 2020. № 3(18). P. 60–69. (In Russian)
  17. Breazeal C., Brooks R. Robot Emotion: A Functional Perspective. Who Needs Emotions? The Brain Meets the Robot. Oxford University Press. 2005. P. 138–169.
  18. Echeagaray-Patron B.A., Kober V.I. Method of face recognition using three-dimensional surfaces. Information processes. 2016. V. 16. № 3. P. 170–178. (In Russian)
  19. Korshunova S.G., Stepanova O.B., Tetik L.V. Spherical model of the space of emotional expression of the face, based on perceived differences. Neurocomputers: development, application. 2012. № 2. P. 42–53. (In Russian)
  20. Korshunova S.G., Stepanova O.B. Linguistic organization of the structure of perception of the tone of mixed emotions expressed by facial expressions. Questions of psychology. 2018. № 6. P. 121–133. (In Russian)
  21. Korshunova S.G., Stepanova O.B. Differentiation of facial expressions and vowel sounds of the Russian language in visual and auditory perception: emotional space of faces. Neurocomputers: development, application. 2014. № 8. P.39–45. (In Russian)
  22. Kozharinov A.S., Kirichenko Yu.A., Afanasyev I.V., Vlasov A.I., Labuz N.P. Analysis of cognitive search methods and concepts for automating intelligent systems of their detective analysis. Neurocomputers: development, application. 2022. V. 24. № 4. P. 38–74. DOI: https://doi.org/10.18127/j19998554-202204-04. (In Russian)
  23. Mishchenkova E.S. Comparative analysis of face recognition algorithms. Bulletin of Volgograd State University. Series 9: Research of Young Scientists. 2015. № 11. P. 75–78. (In Russian)
  24. Turk M., Pentland A. Eigenfaces for recognition. Journal of Cognitive Neuroscience. 1991. V. 3. № 1. P. 71–86.
  25. Yang X. Automatic expression of emotions for the organization of an intellectual interface. Youth Scientific and Technical Bulletin. 2013. No. 9. P. 32. (In Russian)
  26. Bartlett M.S., Hager J.C., Ekman P., Sejnowski T.J. Measuring facial expressions by computer image analysis. Cambridge University Press. 2000. P. 254–265.
  27. Chandran S., Washeef A., Somar M., Debasis M. Facial Expressions: A Cross Cultural Study. Emotion Recognition: A Pattern Analysis Approach. Wiley. 2016. P. 89.
  28. Tukhtasinov M.T., Rajabov S.S. Algorithms of face recognition based on local directional patterns. Problems of computational and applied mathematics. 2016. № 5 (5). P. 101–106. (In Russian)
  29. Mian A.S., Bennamoun M., Owens R. Keypoint detection and local feature matching for textured 3D face recognition. Int. J. Comput. Vis. 2008. V. 79. № 1. P. 1–12.
  30. Zaboleeva A.V. Development of the system of automated determination of emotions and possible areas of application. Open education, 2012. No. 3. P. 60–63. (In Russian)
  31. Viola P., Jones M. Rapid object detection using a boosted cascade of simple features. IEEE Conf. on Computer Vision and Pattern Recognition. Kauai, Hawaii. 2001. V. 1. P. 511–518.
  32. Viola-Jones method as the basis for facial recognition. Electronic resource. Access mode: https://habrahabr.ru/post/133826/, date of reference 11.08.2021.
  33. Acosta J.C., Ward N.G. Responding to user emotional state by adding emotional coloring to utterances. Proc. Interspeech. 2009. P. 1587-1590.
  34. Kalinovsky I. Introduction to the problem of recognizing emotions. Electronic resource. Access mode: https://habr.com/ru/company/speechpro/blog/418151/, date of reference 02.09.2018. (In Russian)
  35. Timoshkin A.G., Vlasov A.I. On the strategy and tactics of the marketing policy of a multidisciplinary computer company. Devices and control systems. 1996. № 9. P. 59–61. (In Russian)
  36. Vlasov A.I., Zenovkin N.V. Methods of visual control in the implementation of user interfaces. Software products and systems. 2011. № 1. P. 23–26. (In Russian)
  37. Mukhamadieva K.B. Comparative analysis of face recognition algorithms. Modern materials, equipment and technologies. 2017. № 7. P. 58–63. (In Russian)
  38. Sablina V.A., Savin A.V. Construction of anthropometric facial points using OpenFace and MediaPipe. Modern technologies in science and education. 2021. P. 107–111. (In Russian)
  39. Hasani B., Mahoor M.H. Facial Expression Recognition Using Enhanced Deep 3D Convolutional Neural Networks. arXiv preprint arXiv:1705.07871. 2017.
  40. Li Y. Deep Learning of Human Emotion Recognition in Videos. Electronic resource. Access mode: https://www.diva-portal.org/smash/get/diva2:1174434/FULLTEXT01.pdf, date of reference 21.07.2021.
Date of receipt: 17.02.2023
Approved after review: 02.03.2023
Accepted for publication: 20.03.2023