EEG-based approach for recognizing human social emotion perception

https://doi.org/10.1016/j.aei.2020.101191

Abstract

Social emotion perception plays an important role in our daily social interactions and is involved in treatments for mental disorders. Hyper-scanning enables brain activities to be measured simultaneously from two or more persons, and was employed in this study to explore social emotion perception. We analyzed the recorded electroencephalogram (EEG) to explore emotion perception in terms of event-related potentials (ERPs) and phase synchronization, and classified emotion categories with a convolutional neural network (CNN). The results showed that (1) the ERP differed significantly among the four emotion categories (i.e., anger, disgust, neutral, and happiness), but not between rating orders (the order in which the paired participants performed their rating actions); (2) the intra-brain phase lag index (PLI) was higher than the inter-brain PLI, but fewer of its connections exhibited significant differences in all typical frequency bands (from delta to gamma); (3) the emotion classification accuracy of inter-PLI-Conv outperformed that of intra-PLI-Conv in each of the five frequency bands. In particular, the classification accuracies averaged across all participants in the alpha band were 65.55% for inter-PLI-Conv and 50.77% for intra-PLI-Conv, both well above chance level. According to our results, happiness can be classified with higher performance than the other categories.
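The phase lag index compared above can be sketched as follows. This is a minimal illustration of the standard PLI (the absolute mean sign of the instantaneous phase difference), not the authors' implementation; the function name and signal parameters are assumptions for the example.

```python
import numpy as np
from scipy.signal import hilbert


def pli(x, y):
    """Phase lag index between two 1-D signals.

    Instantaneous phases are obtained from the analytic signal (Hilbert
    transform). PLI is 0 when the phase difference is consistently zero
    (or symmetrically distributed) and approaches 1 for a consistent
    non-zero phase lag.
    """
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return float(np.abs(np.mean(np.sign(np.sin(dphi)))))
```

For an intra-brain PLI, `x` and `y` would be two channels from the same participant; for an inter-brain PLI, one channel from each participant of a pair.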

Introduction

It is undoubted that emotion is a crucial component of human–human interaction and human–machine interaction in our everyday life [1]. Emotion is also relevant to some mental diseases and detrimental habits such as internet addiction, tristimania [2], anxiety, and social phobia [3]. Although a great deal of effort has been made by researchers from diverse disciplines (e.g., neuroscience, psychology, and computer science) to investigate emotion perception, the knowledge acquired so far is still limited [4], [5]. Most prior studies explored emotion perception in a single-person scenario [6], [7], [8], [9], rather than with multiple persons who interact with each other. In daily life, however, people usually perceive emotion while interacting with others [10], for instance, how one feels about oneself and other people during social interaction [11], [12]. It is therefore preferable to investigate emotion perception during social interaction.

In recent years, hyper-scanning, a technique for measuring brain activities simultaneously from two or more persons, has been utilized to explore brain-to-brain interactions when two or more persons engage in a task [13]. Hyper-scanning can be categorized into fNIRS (functional near-infrared spectroscopy) hyper-scanning [14], fMRI (functional magnetic resonance imaging) hyper-scanning [15], and EEG hyper-scanning [16], according to the signal recorded in the experiment. Because EEG offers high temporal resolution and ease of use at low cost [17], we employed EEG hyper-scanning in our study. Previous studies revealed that the frontal region is dominantly involved in emotion recognition, especially facial expression recognition [18], [19], [20]. This motivated us to focus on the frontal region when exploring emotion perception in the context of multiple persons. Concentrating on a particular region also benefits the subsequent emotion classification step, since less data need to be processed.

In this study, we designed an EEG hyper-scanning experiment to explore emotion perception in a two-person interaction scenario. The characteristics of brain activities were first explored to reveal neural mechanisms related to emotion perception. Then, a deep learning model was utilized to classify different emotion categories based on the findings of the data analysis. The framework of our study is depicted in Fig. 1. The main contributions of this paper are (1) designing an EEG hyper-scanning experiment to explore emotion perception in a two-person interaction scenario; (2) revealing EEG characteristics associated with emotion perception; (3) evaluating the effectiveness of intra-brain and inter-brain phase synchronization features; (4) recognizing emotions from EEG signals using only twelve electrodes over the frontal region, which could facilitate the development of portable emotion recognition systems. The remainder of the paper is organized as follows. Relevant work is introduced in Section 2, followed by the description of the experiment design and setup in Section 3. Next, the preprocessing, ERP analysis, synchronization feature extraction, and convolutional neural networks are described in Section 4, followed by the results in Section 5. Finally, conclusions are drawn in Section 6.


Hyper-scanning experiments

Hyper-scanning is a technique that enables signals to be measured simultaneously from two or more persons. Montague et al. presented an experiment with a deception game played by a pair of participants, which was the first fMRI-based hyper-scanning experiment [15]. About ten years later, the first NIRS-based hyper-scanning was reported, in which NIRS signals were simultaneously collected from two participants engaging in a cooperation task [21]. Babiloni et al. [22] designed the first EEG-based hyper-scanning

Experiment setup

Our study aimed to investigate the neural mechanisms of social emotion perception when two persons engage in the same task. Since there is a potential cultural bias when Chinese participants view pictures from the International Affective Picture System (IAPS) [39], the native Chinese Affective Picture System (CAPS) has been used in studies with Chinese participants [40], [41]. Accordingly, we used facial pictures from the native CAPS in this study.

Three hundred pictures were

Preprocessing

In the signal preprocessing step, EEG data were band-pass filtered with cut-off frequencies of 0.5 Hz and 85 Hz, followed by common average referencing (CAR), in which the mean magnitude averaged across all electrodes was subtracted from the magnitude of each electrode [45]. Independent Component Analysis (ICA) was subsequently used to obtain independent components, as implemented in EEGLAB [46]. The components representing artifacts were detected using MARA [47] (a
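The filtering and re-referencing steps above can be sketched as follows. This is a minimal sketch using SciPy, not the authors' code; the filter order and the sampling rate are illustrative assumptions, as the excerpt does not state them.

```python
import numpy as np
from scipy.signal import butter, filtfilt


def preprocess(eeg, fs=500.0, lo=0.5, hi=85.0):
    """Band-pass filter each channel, then apply common average reference.

    eeg: array of shape (n_channels, n_samples).
    fs:  sampling rate in Hz (assumed here; not given in the excerpt).
    """
    # 4th-order Butterworth band-pass, applied forward and backward
    # (filtfilt) for zero phase distortion
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg, axis=1)
    # CAR: subtract the across-electrode mean from every electrode
    return filtered - filtered.mean(axis=0, keepdims=True)
```

ICA-based artifact removal (EEGLAB/MARA) would follow this step and is omitted here, since it depends on the EEGLAB toolchain.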

Event related potential

Fig. 6 shows the ERPs of the four emotion categories for the twelve selected channels located over the frontal region. It can be seen that the ERPs for anger on channels FP1, FPZ, FP2, AF3, and AF4, and for disgust on channels F1, FZ, and F4, were larger than the ERPs for happiness on those channels. Statistical results are listed in Table 1. P-values are the original significance levels, while Q-values are the significance levels after multiple-comparison correction (using FDR). There were statistically
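The per-category ERPs compared above are obtained by averaging the time-locked epochs of each emotion category. The following is a minimal sketch of that averaging step under assumed array shapes; the function and variable names are illustrative, not from the paper.

```python
import numpy as np


def erp_by_category(epochs, labels):
    """Average epochs per emotion category to obtain ERPs.

    epochs: array (n_trials, n_channels, n_samples), baseline-corrected
    labels: array (n_trials,), emotion category label of each trial
    Returns a dict mapping category -> ERP of shape (n_channels, n_samples).
    """
    return {c: epochs[labels == c].mean(axis=0) for c in np.unique(labels)}
```

Channel-wise statistical comparison between category ERPs (with FDR correction, as in Table 1) would then be run on these averages across participants.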

Conclusion

In this paper, we explored social emotion perception and emotion classification using channels located over the frontal region. We focused on the frontal lobe because (1) it has been reported to play an important role in emotion perception; and (2) from the perspective of practical emotion recognition applications, fewer channels reduce computational complexity and simplify the setup. In this study, we investigated social emotion from the aspects of (1) time-locked

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

The work presented in this paper was supported by the National Natural Science Foundation of China (Nos. 61633010, 61806149, 61673322), and the National Key Basic Research Program of China (No. 2013CB329502). This research was also partially supported by the Ministry of Education and Science of the Russian Federation (grant 14.756.31.0001) and the Polish National Science Center, Poland (grant 2016/20/W/N24/00354).

References (62)

  • Meng, X., et al. Automatic processing of valence differences in emotionally negative stimuli: evidence from an ERP study. Neurosci. Lett. (2009)
  • Delorme, A., et al. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods (2004)
  • Acharya, U.R., et al. Deep convolutional neural network for the automated detection and diagnosis of seizure using EEG signals. Comput. Biol. Med. (2018)
  • Pantic, M., et al. Toward an affect-sensitive multimodal human-computer interaction. Proc. IEEE (2003)
  • Parry-Jones, W.L. Historical aspects of mood and its disorders in young people.
  • Etkin, A., et al. Functional neuroimaging of anxiety: a meta-analysis of emotional processing in PTSD, social anxiety disorder, and specific phobia. Am. J. Psychiatry (2007)
  • Mauss, I.B., et al. Measures of emotion: A review. Cogn. Emot. (2009)
  • Zheng, W.-L., et al. EmotionMeter: A multimodal framework for recognizing human emotions. IEEE Trans. Cybern. (2018)
  • Kohler, C.G., et al. Facial emotion perception in schizophrenia: a meta-analytic review. Schizophr. Bull. (2009)
  • Niedenthal, P.M., et al. Embodiment in attitudes, social perception, and emotion. Pers. Soc. Psychol. Rev. (2005)
  • Cowie, R., et al. Emotion recognition in human-computer interaction. IEEE Signal Process. Mag. (2001)
  • Löwith, K. L. Feuerbach und der Ausgang der klassischen deutschen Philosophie. Riv. Filos. (1928)
  • Barkan, S.E. Sociology: Understanding and changing the social world (2011)
  • Van Kleef, G.A. How emotions regulate social life: The emotions as social information (EASI) model. Curr. Dir. Psychol. Sci. (2009)
  • Cutini, S., et al. Functional near infrared optical imaging in cognitive neuroscience: an introductory review. J. Near Infrared Spectrosc. (2012)
  • Montague, P.R., et al. Hyperscanning: Simultaneous fMRI during linked social interactions (2002)
  • Sinha, N., et al. EEG hyperscanning study of inter-brain synchrony during cooperative and competitive interaction.
  • Watanabe, K., Kashino, M., Nakazawa, K., Shimojo, S. Implicit ambient surface information: From personal to interpersonal, ...
  • Powell, T.R., et al. Telomere length as a predictor of emotional processing in the brain. Hum. Brain Mapp. (2018)
  • Vaidya, A.R., et al. Ventromedial frontal lobe damage affects interpretation, not exploration, of emotional facial expressions. Cortex (2019)
  • Drapeau, J., et al. Emotional recognition from dynamic facial, vocal and musical expressions following traumatic brain injury. Brain Inj. (2017)