Informatica

Kriging Predictor for Facial Emotion Recognition Using Numerical Proximities of Human Emotions
Volume 31, Issue 2 (2020), pp. 249–275
Rasa Karbauskaitė   Leonidas Sakalauskas   Gintautas Dzemyda  

https://doi.org/10.15388/20-INFOR419
Pub. online: 2 June 2020      Type: Research Article      Open Access

Received
1 January 2020
Accepted
1 May 2020
Published
2 June 2020

Abstract

Emotion recognition from facial expressions has attracted much interest over the last few decades. The common approach to facial emotion recognition (FER) in the literature consists of these steps: image pre-processing, face detection, facial feature extraction, and facial expression classification (recognition). We have developed a method for FER that differs fundamentally from this common approach. Our method is based on the dimensional model of emotions and on the kriging predictor of a Fractional Brownian Vector Field. The classification problem related to the recognition of facial emotions is formulated and solved. The relationships among different emotions are estimated by expert psychologists, who place the emotions as points on a plane. The goal is to estimate, by kriging, the position of a new picture's emotion on this plane and to determine which of the emotions identified by the psychologists is the closest one. Seven basic emotions (Joy, Sadness, Surprise, Disgust, Anger, Fear, and Neutral) have been chosen. When the decision is made on the basis of the closest basic emotion, the accuracy of classification into the seven classes is approximately 50%. It has been ascertained that the kriging predictor is suitable for facial emotion recognition with small sets of pictures. More sophisticated classification strategies, in which the basic emotions are grouped, may increase the accuracy.
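The pipeline the abstract describes — predict a new picture's position on the emotion plane by kriging, then assign the closest basic emotion — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the plane coordinates of the basic emotions, the feature vectors, and the power variogram h^(2H) (the variogram of fractional Brownian motion) are all assumed for the example.

```python
import numpy as np

# Hypothetical plane coordinates for the seven basic emotions; in the paper
# these placements are supplied by expert psychologists.
BASIC_EMOTIONS = {
    "Joy": (0.8, 0.5), "Sadness": (-0.7, -0.4), "Surprise": (0.3, 0.8),
    "Disgust": (-0.6, 0.3), "Anger": (-0.5, 0.7), "Fear": (-0.6, 0.8),
    "Neutral": (0.0, 0.0),
}

def classify_by_nearest_emotion(point):
    """Assign the basic emotion whose plane coordinates are closest
    (Euclidean distance) to the predicted point."""
    px, py = point
    return min(BASIC_EMOTIONS,
               key=lambda e: np.hypot(px - BASIC_EMOTIONS[e][0],
                                      py - BASIC_EMOTIONS[e][1]))

def ordinary_kriging(X, Y, x0, H=0.5):
    """Ordinary kriging with the power variogram gamma(h) = h**(2H),
    i.e. the variogram of a fractional Brownian field (an assumed form).
    X: (n, d) feature vectors of training pictures;
    Y: (n, 2) expert-placed emotion-plane coordinates;
    x0: (d,) feature vector of a new picture."""
    n = len(X)
    gamma = lambda h: h ** (2 * H)
    # Pairwise-distance variogram matrix, augmented with the
    # unbiasedness constraint (weights sum to 1).
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(D)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(X - x0, axis=1))
    w = np.linalg.solve(A, b)[:n]  # kriging weights
    return Y.T @ w                 # predicted point on the emotion plane

# Hypothetical usage: three training pictures with 2-D features.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
Y = np.array([[0.8, 0.5], [-0.7, -0.4], [0.0, 0.0]])
pred = ordinary_kriging(X, Y, np.array([0.1, 0.1]))
label = classify_by_nearest_emotion(pred)
```

Because the power variogram makes ordinary kriging an exact interpolator, a query coinciding with a training picture reproduces that picture's emotion coordinates; between pictures the prediction is a weighted blend, which is then snapped to the nearest basic emotion.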


Biographies

Karbauskaitė Rasa
rasa.karbauskaite@mif.vu.lt

R. Karbauskaitė is a researcher in the Cognitive Computing Group at the Institute of Data Science and Digital Technologies of Vilnius University. She received a bachelor's degree in mathematics and informatics (2003) and a master's degree in informatics (2005) from Vilnius Pedagogical University, and a PhD in informatics from Vytautas Magnus University and the Institute of Mathematics and Informatics (2010). Her research interests include multidimensional data visualization, estimation of visualization quality, dimensionality reduction, estimation of the intrinsic dimensionality of high-dimensional data, facial emotion recognition, and data clustering.

Sakalauskas Leonidas
leonidas.sakalauskas@mif.vu.lt

L. Sakalauskas, habil. dr. (2000), prof. (2006). His research interests are data mining, operations research, stochastic optimization, and statistical modelling. He developed a stochastic optimization approach based on Monte Carlo series and studied its convergence, developed the theory of vectorial fractional Brownian fields with an implementation for surrogate modelling, and developed a concept for the modelling and simulation of social-behavioural phenomena. He has written more than 250 scientific publications, 70 of which are referenced in the Clarivate Analytics database, has supervised 15 PhD theses, and has organised more than 20 scientific conferences.

Dzemyda Gintautas
gintautas.dzemyda@mif.vu.lt

G. Dzemyda received the doctoral degree in technical sciences (PhD) in 1984 and the degree of Doctor Habilitatus in 1997, both from Kaunas University of Technology. He was conferred the title of professor at Kaunas University of Technology (1998) and Vilnius University (2018). He is currently at Vilnius University, Institute of Data Science and Digital Technologies, as director of the Institute, head of the Cognitive Computing Group, Professor, and Principal Researcher. His research interests cover visualization of multidimensional data, optimization theory and applications, data mining, multiple criteria decision support, neural networks, and image analysis. He is the author of more than 260 scientific publications, two monographs, and five textbooks.



Copyright
© 2020 Vilnius University
Open access article under the CC BY license.

Keywords
facial emotion recognition, Fractional Brownian Vector Field, kriging predictor, dimensional models of emotions, classifier

INFORMATICA

  • Online ISSN: 1822-8844
  • Print ISSN: 0868-4952
