EEG-based approach for recognizing human social emotion perception
Introduction
Emotion is undoubtedly a crucial component of human–human and human–machine interaction in everyday life [1]. Emotion is also relevant to several mental disorders and detrimental habits, such as internet addiction, tristimania [2], anxiety, and social phobia [3]. Although researchers from diverse disciplines (e.g., neuroscience, psychology, and computer science) have made great efforts to investigate emotion perception, our current knowledge is still limited [4], [5]. Most prior studies explored emotion perception in single-person scenarios [6], [7], [8], [9], rather than with multiple persons interacting with each other. In daily life, however, people usually perceive emotion while interacting with others [10]; for instance, we perceive both our own emotions and those of other people during social interaction [11], [12]. Therefore, emotion perception is better investigated in the context of social interaction.
In recent years, hyper-scanning, a technique for measuring brain activities simultaneously from two or more persons, has been utilized to explore brain-to-brain interactions when two or more persons engage in a task [13]. According to the signal recorded, hyper-scanning can be categorized into fNIRS (functional near-infrared spectroscopy) hyper-scanning [14], fMRI (functional magnetic resonance imaging) hyper-scanning [15], and EEG hyper-scanning [16]. Because EEG offers high temporal resolution, ease of use, and low cost [17], we employed EEG hyper-scanning in our study. Previous studies revealed that the frontal region is dominantly involved in emotion recognition, especially facial expression recognition [18], [19], [20]. This motivated us to focus on the frontal region when exploring emotion perception in a multi-person context. Concentrating on a particular region also benefits the subsequent emotion classification step, since less data must be processed.
In this study, we designed an EEG hyper-scanning experiment to explore emotion perception in a two-person interaction scenario. The characteristics of brain activities were first explored to reveal the neural mechanisms related to emotion perception. Then, a deep learning model was utilized to classify the emotion categories based on the findings of the data analysis. The framework of our study is depicted in Fig. 1. The main contributions of this paper are (1) designing an EEG hyper-scanning experiment to explore emotion perception in a two-person interaction scenario; (2) revealing EEG characteristics associated with emotion perception; (3) evaluating the effectiveness of intra-brain and inter-brain phase synchronization features; (4) recognizing emotions from EEG signals using only twelve electrodes over the frontal region, which could facilitate the development of portable emotion recognition systems. The remainder of the paper is organized as follows. Related work is introduced in Section 2, followed by the description of the experiment design and setup in Section 3. Next, the methodological descriptions of the preprocessing, ERP analysis, synchronization feature extraction, and convolutional neural networks are given in Section 4, followed by the results in Section 5. Finally, conclusions are drawn in Section 6.
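The snippet does not show which phase synchronization metric the authors used. A common choice for intra- and inter-brain phase synchronization in EEG studies is the phase-locking value (PLV); the sketch below is an illustrative assumption using NumPy and SciPy, not the paper's exact implementation:

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase-locking value between two equal-length signals.

    Instantaneous phases come from the analytic signal (Hilbert
    transform); the PLV is the magnitude of the mean phase-difference
    vector, ranging from 0 (no locking) to 1 (perfect locking).
    For inter-brain synchronization, x and y would be channels
    recorded simultaneously from the two participants.
    """
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Toy example: two 10 Hz sinusoids with a fixed phase lag are
# strongly phase-locked (PLV close to 1).
t = np.linspace(0, 1, 500, endpoint=False)
x = np.sin(2 * np.pi * 10 * t)
y = np.sin(2 * np.pi * 10 * t + 0.5)
locked = plv(x, y)
```

In practice the signals would first be band-pass filtered to the frequency band of interest, since phase is only well defined for narrow-band signals.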
Section snippets
Hyper-scanning experiments
Hyper-scanning is a technique that enables simultaneous signal measurement from two or more persons. Montague et al. presented an experiment in which a pair of participants played a deception game, the first fMRI-based hyper-scanning experiment [15]. About ten years later, the first NIRS-based hyper-scanning was reported, in which NIRS signals were simultaneously collected from two participants engaged in a cooperation task [21]. Babiloni et al. [22] designed the first EEG-based hyper-scanning
Experiment setup
Our study aimed to investigate the neural mechanisms of social emotion perception when two persons engage in the same task. Since pictures from the International Affective Picture System (IAPS) carry a potential cultural bias for Chinese participants [39], the native Chinese Affective Picture System (CAPS) has been used in studies with Chinese participants [40], [41]. Accordingly, we used facial pictures from the native CAPS in this study.
Three hundred pictures were
Preprocessing
In the signal preprocessing step, EEG data were band-pass filtered at cut-off frequencies of 0.5 Hz and 85 Hz, followed by common average referencing (CAR), in which the mean magnitude averaged across all electrodes was subtracted from each electrode [45]. Independent Component Analysis (ICA), implemented in EEGLAB [46], was subsequently used to obtain independent components. The components representing artifacts were detected using MARA [47] (a
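The filtering and re-referencing steps above can be sketched as follows. This NumPy/SciPy version is a minimal illustration only; the paper used EEGLAB, and the ICA/MARA artifact-rejection stage is omitted here:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess(eeg, fs, low=0.5, high=85.0):
    """Band-pass filter (0.5-85 Hz) then common average reference.

    eeg: array of shape (n_channels, n_samples); fs: sampling rate in Hz.
    """
    # Zero-phase 4th-order Butterworth band-pass filter
    # (cut-offs normalized by the Nyquist frequency).
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg, axis=1)
    # CAR: subtract the across-channel mean from every channel,
    # so the average of all channels becomes zero at each sample.
    return filtered - filtered.mean(axis=0, keepdims=True)

# Toy example: 4 channels, 2 s of random data at 250 Hz.
rng = np.random.default_rng(0)
data = rng.standard_normal((4, 500))
clean = preprocess(data, fs=250.0)
```

The sampling rate (250 Hz here) and filter order are placeholders; the snippet does not state the values the authors used.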
Event related potential
Fig. 6 shows the ERPs of the four emotion categories for the selected twelve channels located on the frontal region. It can be seen that the ERPs for anger on channels FP1, FPZ, FP2, AF3, and AF4, and for disgust on channels F1, FZ, and F4, were larger than the ERPs for happiness on those channels. Statistical results are listed in Table 1. P-values are the original significance levels, while Q-values are the significance levels after multiple-comparison correction using the false discovery rate (FDR). There were statistically
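FDR correction is commonly implemented with the Benjamini–Hochberg procedure; the snippet does not specify which FDR method the authors used, so the following conversion of raw p-values into q-values is a sketch under that assumption:

```python
import numpy as np

def fdr_bh_qvalues(pvals):
    """Benjamini-Hochberg q-values for an array of p-values.

    For sorted p-values p_(1) <= ... <= p_(m), the q-value is
    q_(i) = min over j >= i of p_(j) * m / j, mapped back to the
    original order and clipped to [0, 1].
    """
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    # Raw BH-adjusted values at each rank.
    ranked = p[order] * m / np.arange(1, m + 1)
    # Enforce monotonicity by taking the running minimum
    # from the largest p-value downwards.
    q_sorted = np.minimum.accumulate(ranked[::-1])[::-1]
    q = np.empty_like(q_sorted)
    q[order] = np.clip(q_sorted, 0.0, 1.0)
    return q

qs = fdr_bh_qvalues([0.001, 0.02, 0.03, 0.2])
```

Here a channel would be reported as significant after correction when its q-value falls below the chosen threshold (e.g., 0.05).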
Conclusion
In this paper, we explored social emotion perception and emotion classification using channels located on the frontal region. We focused on the frontal lobe because (1) it has been reported to play an important role in emotion perception, and (2) from the perspective of practical emotion recognition applications, fewer channels reduce computational complexity and simplify the setup. In this study, we investigated social emotion from the aspects of (1) time-locked
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgments
The work presented in this paper was supported by the National Natural Science Foundation of China (No. 61633010, 61806149, 61673322) and the National Key Basic Research Program of China (No. 2013CB329502). This research was also partially supported by the Ministry of Education and Science of the Russian Federation (grant 14.756.31.0001) and the Polish National Science Center, Poland (grant 2016/20/W/N24/00354).
References (62)
- et al., Neurobiology of emotion perception I: the neural basis of normal emotion perception, Biol. Psychiatry (2003)
- et al., Brain-to-brain synchrony tracks real-world dynamic group interactions in the classroom, Curr. Biol. (2017)
- et al., NIRS-based hyperscanning reveals increased interpersonal coherence in superior frontal cortex during cooperation, Neuroimage (2012)
- et al., Role of the right inferior frontal gyrus in turn-based cooperation and competition: a near-infrared spectroscopy study, Brain Cogn. (2015)
- et al., Interindividual synchronization of brain activity during live verbal communication, Behav. Brain Res. (2014)
- et al., Social neuroscience and hyperscanning techniques: past, present and future, Neurosci. Biobehav. Rev. (2014)
- et al., From social behaviour to brain synchronization: review and perspectives in hyperscanning, IRBM (2011)
- et al., EEG based emotion classification mechanism in BCI, Procedia Comput. Sci. (2018)
- et al., EEG signal classification using PCA, ICA, LDA and support vector machines, Expert Syst. Appl. (2010)
- et al., Automated EEG-based screening of depression using deep convolutional neural network, Comput. Methods Programs Biomed. (2018)
- Automatic processing of valence differences in emotionally negative stimuli: evidence from an ERP study, Neurosci. Lett.
- EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis, J. Neurosci. Methods
- Deep convolutional neural network for the automated detection and diagnosis of seizure using EEG signals, Comput. Biol. Med.
- Toward an affect-sensitive multimodal human-computer interaction, Proc. IEEE
- Historical aspects of mood and its disorders in young people
- Functional neuroimaging of anxiety: a meta-analysis of emotional processing in PTSD, social anxiety disorder, and specific phobia, Am. J. Psychiatry
- Measures of emotion: a review, Cogn. Emot.
- EmotionMeter: a multimodal framework for recognizing human emotions, IEEE Trans. Cybern.
- Facial emotion perception in schizophrenia: a meta-analytic review, Schizophr. Bull.
- Embodiment in attitudes, social perception, and emotion, Pers. Soc. Psychol. Rev.
- Emotion recognition in human-computer interaction, IEEE Signal Process. Mag.
- L. Feuerbach und der Ausgang der klassischen deutschen Philosophie, Riv. Filos.
- Sociology: Understanding and Changing the Social World
- How emotions regulate social life: the emotions as social information (EASI) model, Curr. Dir. Psychol. Sci.
- Functional near infrared optical imaging in cognitive neuroscience: an introductory review, J. Near Infrared Spectrosc.
- Hyperscanning: simultaneous fMRI during linked social interactions
- EEG hyperscanning study of inter-brain synchrony during cooperative and competitive interaction
- Telomere length as a predictor of emotional processing in the brain, Hum. Brain Mapp.
- Ventromedial frontal lobe damage affects interpretation, not exploration, of emotional facial expressions, Cortex
- Emotional recognition from dynamic facial, vocal and musical expressions following traumatic brain injury, Brain Inj.