Validity Study of an Emotional Face-Database in Iranian Community
Abstract
Objective: Investigating the accuracy of facial emotion recognition in the Iranian community requires a face database validated in that community. We therefore conducted a validation study of the Radboud Faces Database. The primary objective of this study was to evaluate the accuracy of recognizing emotions from faces in an Iranian sample and then to select the pictures with high agreement on the emotion detected.
Method: This cross-sectional study recruited a total of 142 male and female participants aged 20 to 50 years (mean ± SD age: 31.7 ± 7.07). Participants were instructed to identify the emotion expressed by each face, as well as its valence and arousal. For each picture, we assessed the percentage of participants agreeing on its emotion label. One-way and repeated-measures ANOVA analyses were used to evaluate the effect of different variables on participants' accuracy.
Results: Emotional faces were recognized correctly by around 84% of the participants. Accuracy was highest for happy faces (mean ± SD: 98 ± 6.1%) and lowest for neutral faces (75 ± 18.06%). Accuracies for the other emotions were as follows: sad (91 ± 8.7%), surprised (87 ± 10.64%), angry (77 ± 15.6%), and fearful (76 ± 15.26%). We found no difference between male and female participants in recognizing emotions. We then selected the pictures for which agreement on the emotion label exceeded 85%.
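The agreement measure and the 85% selection criterion described above can be sketched in a few lines of Python. The picture names, intended labels, and rating counts below are purely illustrative (not data from the study); each picture's agreement is the percentage of raters whose chosen label matches the intended emotion.

```python
from collections import Counter

# Hypothetical ratings: picture id -> list of emotion labels chosen by raters.
# 142 raters per picture, mirroring the study's sample size.
ratings = {
    "face_01": ["happy"] * 138 + ["surprised"] * 4,
    "face_02": ["fearful"] * 95 + ["surprised"] * 47,
}
# Intended (posed) emotion for each picture.
intended = {"face_01": "happy", "face_02": "fearful"}

def agreement(labels, target):
    """Percentage of raters whose label matches the intended emotion."""
    return 100.0 * Counter(labels)[target] / len(labels)

# Keep only pictures whose agreement exceeds the 85% criterion.
selected = [pic for pic, labels in ratings.items()
            if agreement(labels, intended[pic]) > 85.0]
```

Under these toy numbers, only `face_01` (138/142 ≈ 97% agreement) passes the criterion, while `face_02` (95/142 ≈ 67%) is excluded.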
Conclusion: The current study provides an emotional face database validated against Iranian participants' recognition of basic emotions. The selected pictures can be used to design tasks that evaluate emotion recognition ability in clinical and nonclinical populations, as well as applications that train emotion recognition in clinical samples such as individuals with autism spectrum disorder.
Issue: Vol 19 No 4 (2024)
Section: Original Article(s)
DOI: https://doi.org/10.18502/ijps.v19i4.16552
Keywords: Facial Expression Recognition; Facial Emotion Recognition; Face Expressions; Iranian People; Validation

Rights and permissions: This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.