
Detecting naturalistic expression of emotions using physiological signals while playing video games

  • Original Research
Journal of Ambient Intelligence and Humanized Computing

Abstract

Affective gaming has recently become an active research field, owing to the importance of the player's emotions while playing computer games. Emotions can be detected from various modalities such as facial expressions, voice, and physiological signals. In this study, we evaluate an XGBoost ensemble method and a deep neural network for detecting naturalistic expressions of emotions of video game players from physiological signals. Physiological data were collected from twelve participants while playing the PUBG mobile game. Both discrete and dimensional emotion models were evaluated. We assessed the performance of classification models using individual physiological channels and a fusion of these channels, and we compare user-dependent and user-independent models. Our results indicate that the dimensional valence-arousal model yields higher accuracy than the discrete emotion model. The results also show that ECG features, and a fusion of features from all physiological channels, provide the highest affect detection accuracy. Our user-dependent deep neural network achieved the highest accuracies, 77.92% for valence and 78.58% for arousal, using a fusion of features. The user-independent models were not feasible, presumably due to strong individual differences in physiological responses.
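The feature-level fusion mentioned in the abstract — extracting features from each physiological channel and concatenating them into a single vector before classification — can be sketched as follows. This is a minimal illustration only: the channel names, window length, and statistical features are assumptions for the sketch (the paper's exact feature set is not reproduced here), and no classifier is included; in the study the fused vector would feed the XGBoost or deep neural network models.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(window):
    # Illustrative time-domain statistics computed per physiological channel
    return np.array([window.mean(), window.std(), window.min(), window.max()])

# Synthetic stand-ins for ECG, EDA, and respiration signal windows
channels = {name: rng.normal(size=100) for name in ("ecg", "eda", "resp")}

# Feature-level fusion: concatenate the per-channel feature vectors
fused = np.concatenate([extract_features(sig) for sig in channels.values()])
print(fused.shape)  # (12,): 4 features x 3 channels
```

The same fused vector can then be labeled with the player's self-reported valence or arousal rating for supervised training, which is the setup the abstract's user-dependent models describe.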


Notes

  1. https://biosignalsplux.com/products/kits/professional.html.


Author information

Correspondence to Omar AlZoubi.



Cite this article

AlZoubi, O., AlMakhadmeh, B., Bani Yassein, M. et al. Detecting naturalistic expression of emotions using physiological signals while playing video games. J Ambient Intell Human Comput 14, 1133–1146 (2023). https://doi.org/10.1007/s12652-021-03367-7
