Journal Neurocomputers. No. 4, 2025
Article in issue:
Development of a face image database for facial emotion recognition
Type of article: scientific article
DOI: https://doi.org/10.18127/j19998554-202504-03
UDC: 004.89
Authors:

B.Kh. Abdulaev1, A.I. Vlasov2, T.M. Fatkhutdinov3
1– 3 Bauman Moscow State Technical University (Moscow, Russia)

1 batal990@mail.ru, 2 vlasov@iu4.ru, 3 fatkhutdinovtm@gmail.com

Abstract:

The global digital transformation affects all spheres of human activity and poses new challenges. One of these challenges is to improve the quality of industrial processes and the safety of people at work by tracking a person's emotional state in order to prevent potential risks in a timely manner. The article analyzes the problems of recognizing emotions from the human face in the context of developing a database of face images for facial emotion recognition. The main attention is paid to the most effective methods of facial emotion recognition, their limitations, and the logic of selecting images for building the database.

The aim of this paper is to develop a face image database for facial emotion recognition. During database development, photographs of seven people were collected for each of the six basic emotions. For two people, the profile instances in the database can be considered complete and meet the specified database quality metrics. For the other five people, the photographs in their profile instances were collected only partially and receive an average quality score under the given metrics. Overall, the collected database has above-average quality indicators and is considered 40% complete.
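The completeness figure above can be thought of as an aggregate over per-subject profiles. A minimal sketch of such a metric is given below; the target photo count per emotion and the scoring scheme are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical completeness metric for a face-emotion image database:
# 7 subjects, each with photo sets for the 6 basic emotions.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]
TARGET_PER_EMOTION = 10  # assumed target number of photos per emotion


def profile_completeness(photo_counts: dict) -> float:
    """Fraction of the target photo set collected for one subject's profile."""
    collected = sum(min(photo_counts.get(e, 0), TARGET_PER_EMOTION)
                    for e in EMOTIONS)
    return collected / (TARGET_PER_EMOTION * len(EMOTIONS))


def database_completeness(profiles: list) -> float:
    """Average completeness over all subject profiles in the database."""
    return sum(profile_completeness(p) for p in profiles) / len(profiles)
```

With two complete profiles and five partially filled ones, the average naturally lands between "complete" and "partial", matching the paper's overall 40% figure in spirit.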

The proposed image database makes it possible to determine the emotion on a human face with high probability: the catalogue covers different angles of the human face and an extensive range of variations in the expression of the same emotion, which enables highly accurate analysis of a recorded emotion through multiple comparisons against the database of facial emotion images.
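The "multiple comparisons" step described above amounts to matching a recorded face descriptor against every labeled entry in the database and taking the emotion of the closest match. The sketch below illustrates this nearest-neighbor idea on plain feature vectors; descriptor extraction (e.g. facial landmarks) is outside its scope, and all names and vectors are hypothetical.

```python
import math


def classify_emotion(query, db_vectors, db_labels):
    """Return the emotion label of the database descriptor nearest to the
    query descriptor, using Euclidean distance over feature vectors."""
    best_i = min(range(len(db_vectors)),
                 key=lambda i: math.dist(query, db_vectors[i]))
    return db_labels[best_i]
```

In practice one would compare against many entries per emotion (different angles, different expression variants), which is exactly why the breadth of the collected database matters for accuracy.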

Pages: 24-44
For citation

Abdulaev B.Kh., Vlasov A.I., Fatkhutdinov T.M. Development of a face image database for facial emotion recognition. Neurocomputers. 2025. V. 27. № 4. P. 24–44. DOI: https://doi.org/10.18127/j19998554-202504-03 (in Russian)

Date of receipt: 30.04.2025
Approved after review: 26.05.2025
Accepted for publication: 28.07.2025