Journal Neurocomputers. No. 5, 2021
Article in issue:
System of automatic analysis of methods for recognizing the emotional state of a person
Type of article: scientific article
DOI: https://doi.org/10.18127/j19998554-202105-03
UDC: 004.8
Authors:

A.I. Vlasov1, I.T. Larionov2, A.N. Orekhov3, L.V. Tetik4

1, 2 Bauman Moscow State Technical University (Moscow, Russia)

3 Foundation for Assistance to the Creation and Implementation of Computer Psyche (Moscow, Russia)

4 Faculty of Psychology, Lomonosov Moscow State University (Moscow, Russia)

Abstract:

The adoption of digital transformation methods and tools in industry and the social sphere poses new challenges. One of them is the management of active systems, where simply registering and identifying the initiators of actions is not enough: a deeper assessment of their state, including the psychophysical and emotional state, is required. The article analyzes methods and means of recognizing a person's emotional state. Approaches to automated recognition based on primary, secondary and more complex features are considered. The main focus is on a comprehensive approach to recognizing a person's emotional state from visual and audio channels using neural networks and computer psyche algorithms.

The aim of the article is to formalize methods and develop means for recognizing a person's emotional state by complex audiovisual criteria.

Software for recognizing a person's emotional state is analyzed: FaceReader (Noldus), EmoDetect, FaceSecurity, Microsoft Oxford Project Emotion Recognition, eMotion Software, and MMER_FEASy. Methods for recognizing a person's emotional state from the face are investigated: the principal component method, the Viola-Jones method, template matching, the Hopfield neural network, a method based on localizing key points of the face, and a method based on texture information. Methods for recognizing a person's emotional state from speech are analyzed separately. A solution for multilevel recognition of a person's emotional state based on neural network and computer psyche algorithms is proposed.
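One of the facial methods listed above, the Hopfield neural network, stores reference patterns and iteratively restores a corrupted input to the nearest stored pattern. The following sketch is illustrative only, not the implementation described in the article; it assumes facial features have already been binarized into ±1 vectors:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: accumulate outer products of the stored +/-1 patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)   # no self-connections
    return W / len(patterns)

def recall(W, state, max_steps=20):
    """Synchronous sign updates until the state stops changing."""
    state = state.copy()
    for _ in range(max_steps):
        nxt = np.sign(W @ state)
        nxt[nxt == 0] = 1     # break ties deterministically
        if np.array_equal(nxt, state):
            break
        state = nxt
    return state

# Store one 16-element pattern and recover it from a copy with two flipped bits.
pattern = np.array([1] * 8 + [-1] * 8)
W = train_hopfield(pattern[None, :])
noisy = pattern.copy()
noisy[[0, 5]] *= -1           # corrupt two components
restored = recall(W, noisy)   # converges back to `pattern`
```

Zeroing the diagonal prevents each neuron from trivially reinforcing its own state; in practice such a network reliably stores only on the order of 0.14·n random patterns, which is one reason the article's comprehensive approach combines several recognition methods rather than relying on any single one.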

The developed approach can be used in a variety of digital applications, from analyzing the psychophysiological and emotional state of a human operator (pilot, driver, etc.) to multimedia mobile applications that assess the emotional state of an interlocutor. The trend toward remote work and self-employment opens up further application areas: psychological consulting, online interviewing, and optimizing the human resources of companies and organizations. Prospects for using the latest versions of emotional analyzers are shown.

Pages: 33-50
For citation:

Vlasov A.I., Larionov I.T., Orekhov A.N., Tetik L.V. System of automatic analysis of methods for recognizing the emotional state of a person. Neurocomputers. 2021. V. 23. № 5. P. 33–50. DOI: https://doi.org/10.18127/j19998554-202105-03. (In Russian).

References
  1. Shakhnov V.A., Kurnosenko A.E. Modelirovaniye tsifrovogo proizvodstva elektronnoy apparatury v ramkakh kontseptsii "Industriya 4.0". Sb. materialov I Mezhdunar. nauch.-prakt. konf. «Tsifrovaya transformatsiya promyshlennosti: tendentsii, upravleniye, strategii». 2019. S. 585–594. (in Russian).
  2. Burkov V. N., Novikov D. A. Teoriya aktivnykh sistem (Istoriya razvitiya i sovremennoye sostoyaniye). Problemy upravleniya. 2009. № 3.1. S. 28–35. (in Russian).
  3. Vlasov A.I., Zhuravleva L.V., Kazakov V.V. Primeneniye vizualnykh instrumentov BPMN dlya modelirovaniya tekhnologicheskoy podgotovki proizvodstva (obzor). Informatsionnyye tekhnologii v proyektirovanii i proizvodstve. 2020. № 1 (177). S. 14–26. (in Russian).
  4. Vlasov A.I. Sovremennoye sostoyaniye i tendentsii razvitiya teorii i praktiki aktivnogo gasheniya volnovykh poley. Pribory i sistemy upravleniya. 1997. № 12. S. 59–70. (in Russian).
  5. Burkov V.N., Novikov D.A. Teoriya aktivnykh sistem. Tr. mezhdunar. nauch.-prakt. konf. (v 2-kh tomakh) M.: IPU RAN. 2001. T. 1. 182 s. (in Russian).
  6. Mitina G.V., Nugayeva A.N., Shurukhina G.A. Psikhologiya emotsiy i motivatsii: ucheb.-metod. posobiye. Ufa: Izd-vo BGPU. 2020. 110 s. (in Russian).
  7. Vasilova E.V., Vlasov A.I., Evdokimov G.M. Neverbalnyye kommunikatsii zhivotnogo mira: sistemnyy analiz zhestovykh yazykov. Mezhdunarodnyy nauchno-issledovatelskiy zhurnal. 2017. № 5–3 (59). S. 14–23. (in Russian).
  8. Vasilova E.V., Vlasov A.I., Evdokimov G.M. Neverbalnyye kommunikatsii zhivotnogo mira: kartirovaniye elementov zhe-stovykh yazykov. Mezhdunarodnyy nauchno-issledovatelskiy zhurnal. 2017. № 6–3 (60). S. 102–110. (in Russian).
  9. Plamper Ya. Istoriya emotsiy. M.: Novoye literaturnoye obozreniye. 2018. 568 s. (in Russian).
  10. Korshunova S.G., Stepanova O.B., Tetik L.V. Sfericheskaya model prostranstva emotsionalnogo vyrazheniya litsa, osnovannaya na vosprinimayemykh razlichiyakh. Neyrokompyutery: razrabotka, primeneniye. 2012. № 2. S. 42–53. (in Russian).
  11. Korshunova S.G., Stepanova O.B. Yazykovaya organizatsiya struktury vospriyatiya tona smeshannykh emotsiy, vyrazhennykh litsevymi ekspressiyami. Voprosy psikhologii. 2018. № 6. S. 121–133. (in Russian).
  12. Korshunova S.G., Stepanova O.B. Differentsiatsiya litsevykh ekspressiy i glasnykh zvukov russkogo yazyka v zritelno-slukhovom vospriyatii: emotsionalnoye prostranstvo lits. Neyrokompyutery: razrabotka. primeneniye. 2014. № 8. S. 39–45. (in Russian).
  13. Rasskazova S.I. Metod formantnogo analiza na osnove veyvlet-preobrazovaniya v sistemakh raspoznavaniya rechi. Sb. IX Molodezh. nauchn.-tekhnich. konf. «Naukoyemkiye tekhnologii i intellektualnyye sistemy» 2007. S. 38–43. (in Russian).
  14. Orekhov A.N. Modelirovaniye psikhicheskikh i sotsialno-psikhologicheskikh protsessov: nomoteticheskiy podkhod: Avtoref. diss. … dokt. psikhol. nauk. M.: 2006. (in Russian).
  15. Ekman P. Psikhologiya lzhi. S-Pb: Piter. 2000. 272 s. (in Russian).
  16. Leontyev V. O. Klassifikatsiya emotsiy. Odessa: Innovatsionno-ipotechnyy tsentr. 2002. 84 s. (in Russian).
  17. Krivonos Yu.G., Krak Yu.V., Efimov G.M. Modelirovaniye i analiz mimicheskikh proyavleniy emotsiy. Dopovidi NANU. 2008. № 12. S. 51–55. (in Russian).
  18. Vlasov A.I., Konkova A.F. Mediko-diagnosticheskiye ekspertnyye sistemy dlya otsenki adekvatnosti adaptivnoy reaktsii organizma na vozdeystviye ekstremalnykh faktorov. Konversiya. 1995. № 9–10. S. 18–21. (in Russian).
  19. Minenko A.S., Vanzha T.V. Sistema raspoznavaniya emotsionalnogo sostoyaniya cheloveka. Problemy iskusstvennogo intellekta. 2020. №3 (18). S. 60–69. (in Russian).
  20. Breazeal P., Washeef A. Robot Emotion: A Functional Perspective. In: Who Needs Emotions? The Brain Meets the Robot. MIT Press. 2003. P. 138–169.
  21. Echeagaray-Patron B.A., Kober V.I. Metod raspoznavaniya lits s ispolzovaniyem trekhmernykh poverkhnostey. Informatsionnyye protsessy. 2016. T. 16. № 3. C. 170-178. (in Russian).
  22. Yan Si. Avtomaticheskoye raspoznavaniye emotsiy polzovatelya dlya organizatsii intellektualnogo interfeysa. Molodezhnyy nauchno-tekhnicheskiy vestnik. 2013. № 2 (4). S. 51. (in Russian).
  23. Bartlett M.S., Hager J.C., Ekman P., Sejnowski T.J. Measuring facial expressions by computer image analysis. Cambridge University Press. 2000. P. 254–265.
  24. Sistemy videonablyudeniya [Elektronnyy resurs] – URL: http://www.vocord.ru/company/VOCORDsystem (data obrashcheniya: 08.08.2021) (in Russian).
  25. Chandran S., Washeef A., Somar M., Debasis M. Facial Expressions: A Cross Cultural Study. in: «Emotion Recognition: A Pattern Analysis Approach». Wiley. 2016. 89 p.
  26. Face Recognition with Local Binary Patterns [Elektronnyy resurs] – URL: http://uran.donetsk.ua/~masters/2011/frt/dyrul/ library/article8.pdf (data obrashcheniya: 07.08.2021)
  27. Mishchenkova E.S. Sravnitelnyy analiz algoritmov raspoznavaniya lits. Vestnik Volgogradskogo gosudarstvennogo universiteta. Seriya 9: Issledovaniya molodykh uchenykh. 2015. № 11. S. 75–78. (in Russian).
  28. Turk M., Pentland A. Eigenfaces for recognition. Journal of Cognitive Neuroscience. 1991. V. 3. P. 71–86.
  29. Viola P., Jones M. Rapid object detection using a boosted cascade of simple features. IEEE Conf. on Computer Vision and Pattern Recognition. 2001. V. 1. P. 513–520.
  30. Metod Violy-Dzhonsa (Viola-Jones) kak osnova dlya raspoznavaniya lits. [Elektronnyy resurs] – URL: https://habrahabr.ru/ post/133826/ (data obrashcheniya: 11.08.2021) (in Russian).
  31. Tukhtasinov M.T., Radzhabov S.S. Algoritmy raspoznavaniya lits na osnove lokalnykh napravlennykh shablonov. Problemy vychislitelnoy i prikladnoy matematiki. 2016. № 5 (5). S. 101–106. (in Russian).
  32. Zaboleyeva A.V. Razvitiye sistemy avtomatizirovannogo opredeleniya emotsiy i vozmozhnyye sfery primeneniya. Otkrytoye obrazovaniye. 2012. № 3. S. 60–63. (in Russian).
  33. Orekhov A.N. Resheniye nestandartnykh zadach kompyuternoy sistemoy. Neyrokompyutery: razrabotka, primeneniye. 2021. T. 23. № 3. S. 43–62. DOI: https://doi.org/10.18127/j19998554-202103-05 (in Russian).
  34. Mian A.S., Bennamoun M., Owens R. Keypoint detection and local feature matching for textured face recognition. Int. J. Comput. Vis. 2011. Vol. 80. No. 1. P. 1–13.
  35. Dlib [Elektronnyy resurs] URL: http://dlib.net/ (data obrashcheniya: 13.08.2021)
  36. Gabor D. Theory of communication. Part 1: The analysis of information. Journal of the IEE. 1946. V. 93. № 27. P. 430–440.
  37. Baza biometricheskikh dannykh i algoritmy raspoznavaniya [Elektronnyy resurs] – URL: http://biometrics.idealtest.org/ datasets/1/1000/base/3450 (data obrashcheniya: 15.08.2021) (in Russian).
  38. Galactic Messenger Network [Elektronnyy resurs] – URL: www.galactic.org (data obrashcheniya: 16.08.2021)
  39. Acosta J.C., Ward N.G. Responding to User Emotional State by Adding Emotional Coloring to Utterances
  40. International Speech Communication Association [Elektronnyy resurs] – URL: www.isca-speech.org (data obrashcheniya 14.08.2021).
Date of receipt: 17.08.2021
Approved after review: 27.08.2021
Accepted for publication: 24.09.2021