ISSN: 1023-5086

Opticheskii Zhurnal (scientific and technical journal)

A full-text English translation of the journal is published by Optica Publishing Group under the title “Journal of Optical Technology”.

DOI: 10.17586/1023-5086-2022-89-08-97-103

UDC: 159.93

Database of video images of natural emotional facial expressions: perception of emotions and automated analysis of facial structure

For Russian citation (Opticheskii Zhurnal):

Королькова О.А., Лободинская Е.А. База видеоизображений естественных эмоциональных экспрессий: восприятие эмоций и автоматизированный анализ мимики лица // Оптический журнал. 2022. Т. 89. № 8. С. 97–103. http://doi.org/10.17586/1023-5086-2022-89-08-97-103


Korolkova O.A., Lobodinskaya E.A. Database of video images of natural emotional facial expressions: perception of emotions and automated analysis of facial structure [in Russian] // Opticheskii Zhurnal. 2022. V. 89. № 8. P. 97–103. http://doi.org/10.17586/1023-5086-2022-89-08-97-103

For citation (Journal of Optical Technology):

O. A. Korolkova and E. A. Lobodinskaya, "Database of video images of natural emotional facial expressions: perception of emotions and automated analysis of facial structure," Journal of Optical Technology 89(8), 498–501 (2022). https://doi.org/10.1364/JOT.89.000498

Abstract:

Subject of study. The development of methods for the automated recognition of emotions on a face requires new stimulus sets that comprise dynamic natural emotional expressions. To date, no such databases have been designed or validated on the Russian population.

Aim of study. The aims of this study were to develop a database of dynamic recordings of natural emotional facial expressions and to compare the profiles of subjective evaluation of these recordings with the results of an automated analysis of facial movements.

Method. Emotional profiles of the video images of facial expressions were based on their evaluation by human observers. Study participants (N = 250) rated the intensity of 33 differential emotions on a 5-point scale in each of 210 video fragments containing the natural expressions of 5 models. An automated analysis of facial structure in the video fragments was performed using OpenFace 2.0 to quantify the dynamic changes on the models’ faces. The emotional profiles of the video fragments were then compared with the results of the automated coding using representational similarity analysis: we calculated the rank correlation coefficient between matrices that represent the structure of the subjective evaluation of the expressions and of their formal description. Additionally, we performed a k-means cluster analysis of the subjective evaluations to identify the categorical structure of the perceived emotional states.

Main results. The representational similarity analysis demonstrated a significant positive correlation between the subjective evaluation of the expressions and their description in terms of facial actions. The correlation was nevertheless low (0.214), which suggests substantial variability among the facial movement patterns that can be subjectively perceived as similar emotions. The cluster analysis revealed five clusters corresponding to basic emotions: attention, joy, surprise, sadness, and disgust.

Practical significance. The developed database of natural emotional expressions will be of interest to researchers in affective computing, particularly for the development of more effective methods for recognizing users’ emotional states and for more accurate simulation of emotional responses in robotic systems.
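
The two analysis steps described in the abstract, representational similarity analysis and k-means clustering, can be sketched in a few lines of R, the environment cited in references 14–16. The sketch below is illustrative only and is not the authors' code: the data are simulated with the study's dimensions (210 video fragments, 33 rated emotions), the variable names (ratings, au_profile) are hypothetical, and base-R functions (dist, cor, kmeans) stand in for the distantia and clValid packages used in the published analysis.

# Illustrative sketch (simulated data), assuming per-fragment summaries:
# observer rating profiles and, e.g., mean OpenFace action-unit intensities.
set.seed(1)
n_videos   <- 210
ratings    <- matrix(runif(n_videos * 33, 0, 4), nrow = n_videos)  # 33 emotions, 5-point scale
au_profile <- matrix(runif(n_videos * 17, 0, 5), nrow = n_videos)  # hypothetical AU intensity summaries

# Step 1: representational dissimilarity matrices (RDMs), i.e. pairwise
# distances between the 210 fragments in each description space.
rdm_ratings <- dist(ratings)     # Euclidean distance by default
rdm_aus     <- dist(au_profile)

# Step 2: the core RSA comparison: rank-correlate the two RDMs.
# as.vector() on a "dist" object yields its lower triangle.
rsa_rho <- cor(as.vector(rdm_ratings), as.vector(rdm_aus),
               method = "spearman")
print(rsa_rho)  # the study reports a correlation of 0.214 on the real data

# Step 3: k-means clustering of the rating profiles. k = 5 matches the
# five clusters reported in the study (attention, joy, surprise,
# sadness, disgust); the study chose k via cluster validation.
km <- kmeans(ratings, centers = 5, nstart = 25)
table(km$cluster)  # number of video fragments per cluster

On real data, ratings would hold each fragment's mean intensity profile across the 250 observers, and au_profile a per-fragment summary of the OpenFace time series; the rank correlation makes the comparison robust to the different scales of the two descriptions.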

Keywords:

facial expression recognition, basic emotions, emotion induction, automated analysis of facial images, affective computing

Acknowledgements:

The research was supported by the Russian Science Foundation (RSF) within scientific project No. 18-18-00350-П, "Perception within nonverbal communication".

OCIS codes: 330.5020, 330.5510, 100.2000

References:

1. D’Mello S., Kappas A., Gratch J. The affective computing approach to affect measurement // Emotion Review. 2018. V. 10. № 2. P. 174–183. DOI: 10.1177/1754073917696583
2. Dupré D., Krumhuber E.G., Küster D., McKeown G.J. A performance comparison of eight commercially available automatic classifiers for facial affect recognition // PLOS ONE. 2020. V. 15. № 4. P. e0231968. DOI: 10.1371/journal.pone.0231968
3. Ghimire D., Jeong S., Lee J., Park S.H. Facial expression recognition based on local region specific features and support vector machines // Multimedia Tools and Applications. 2017. V. 76. № 6. P. 7803–7821. DOI: 10.1007/s11042-016-3418-y
4. Ekman P., Friesen W.V. Facial action coding system: A technique for the measurement of facial movement. Palo Alto, CA: Consulting Psychologists Press, 1978. 527 p.
5. Siedlecka E., Denson T.F. Experimental methods for inducing basic emotions: A qualitative review // Emotion Review. 2019. V. 11. № 1. P. 87–97. DOI: 10.1177/1754073917749016
6. Fernández-Aguilar L., Navarro-Bravo B., Ricarte J., Ros L., Latorre J.M. How effective are films in inducing positive and negative emotional states? A meta-analysis // PLOS ONE. 2019. V. 14. № 11. P. 1–28. DOI: 10.1371/journal.pone.0225040
7. Diconne K., Kountouriotis G.K., Paltoglou A.E., Parker A., Hostler T.J. Presenting KAPODI — the searchable database of emotional stimuli sets // Emotion Review. 2022. V. 14. № 1. P. 84–95. DOI: 10.1177/17540739211072803
8. Korolkova O.A., Lobodinskaya E.A. Development and validation of BEVEL dataset of natural dynamic facial expressions // Neurotechnologies. Chapter 11 / Eds. Shelepin Y., Alekseenko S., N. Nan Chu. St. Petersburg: VVM, 2021. P. 123–140.
9. Keltner D., Sauter D., Tracy J., Cowen A. Emotional expression: Advances in basic emotion theory // Journal of Nonverbal Behavior. 2019. V. 43. № 2. P. 133–160. DOI: 10.1007/s10919-019-00293-3
10. Lindquist K.A., Barrett L.F. Constructing emotion // Psychological Science. 2008. V. 19. № 9. P. 898–903. DOI: 10.1111/j.1467-9280.2008.02174.x
11. Izard C.E. The psychology of emotions. New York: Springer Science & Business Media, 1991. 452 p.
12. Baltrusaitis T., Zadeh A., Lim Y.C., Morency L.-P. OpenFace 2.0: Facial behavior analysis toolkit // 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018). Xi’an, China: IEEE. 15–19 May 2018. P. 59–66. DOI: 10.1109/FG.2018.00019
13. Kriegeskorte N. Representational similarity analysis — connecting the branches of systems neuroscience // Frontiers in Systems Neuroscience. 2008. V. 2. P. 1–28. DOI: 10.3389/neuro.06.004.2008
14. R Core Team. R: A language and environment for statistical computing // R Foundation for Statistical Computing. Vienna, Austria. 2020. URL: https://www.R-project.org/.
15. Benito B.M., Birks H.J.B. Distantia: an open source toolset to quantify dissimilarity between multivariate ecological time series // Ecography. 2020. V. 43. № 5. P. 660–667. DOI: 10.1111/ecog.04895
16. Brock G., Pihur V., Datta S., Datta S. clValid: An R package for cluster validation // Journal of Statistical Software. 2008. V. 25. № 4. P. 1–22. DOI: 10.18637/jss.v025.i04