Abstract
In this paper, we present a novel method, called the four-dimensional convolutional recurrent neural network, which explicitly integrates the frequency, spatial, and temporal information of multichannel EEG signals to improve EEG-based emotion recognition accuracy. First, to preserve these three kinds of information, we transform the differential entropy features from the different channels into 4D structures to train the deep model. Then, we introduce the CRNN model, which combines a convolutional neural network (CNN) and a recurrent neural network with long short-term memory (LSTM) cells. The CNN learns frequency and spatial information from each temporal slice of the 4D inputs, and the LSTM extracts temporal dependencies from the CNN outputs. The output of the last LSTM node is used for classification. Our model achieves state-of-the-art performance on both the SEED and DEAP datasets under intra-subject splitting. The experimental results demonstrate the effectiveness of integrating the frequency, spatial, and temporal information of EEG for emotion recognition.
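The architecture described above (a per-slice CNN feeding an LSTM whose last output is classified) can be sketched in PyTorch as follows. This is a minimal illustration, not the authors' implementation: the layer sizes, kernel sizes, number of frequency bands, and the 9x9 electrode grid are assumptions for the sake of a runnable example.

```python
import torch
import torch.nn as nn

class CRNN4D(nn.Module):
    """Illustrative 4D CNN+LSTM sketch (hyperparameters are assumed).

    Input shape: (batch, T, bands, H, W) -- T temporal slices, each a
    bands x H x W frequency/spatial map of differential entropy features.
    """
    def __init__(self, bands=4, hidden=64, n_classes=3):
        super().__init__()
        # CNN applied independently to every temporal slice
        self.cnn = nn.Sequential(
            nn.Conv2d(bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # -> (N, 64, 1, 1)
        )
        # LSTM over the sequence of per-slice CNN features
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):                      # x: (B, T, bands, H, W)
        B, T = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1))      # fold time into batch: (B*T, 64, 1, 1)
        feats = feats.flatten(1).view(B, T, -1)  # (B, T, 64)
        out, _ = self.lstm(feats)              # (B, T, hidden)
        return self.fc(out[:, -1])             # classify from the last LSTM step

x = torch.randn(2, 6, 4, 9, 9)   # 2 samples, 6 slices, 4 bands, 9x9 electrode grid
logits = CRNN4D()(x)             # -> shape (2, 3)
```

Folding the temporal dimension into the batch (`x.flatten(0, 1)`) is a common way to apply the same 2D CNN to every temporal slice before the LSTM models the dependencies across slices.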
Acknowledgements
This work was supported by the National Key R&D Program of China (2017YFE0118200 and 2017YFE0116800), the NSFC (61633010), the Key Research and Development Project of Zhejiang Province (2020C04009), and the Fundamental Research Funds for the Provincial Universities of Zhejiang (GK209907299001-008). The authors also thank the National International Joint Research Center for Brain-Machine Collaborative Intelligence (2017B01020) and the Key Laboratory of Brain Machine Collaborative Intelligence of Zhejiang Province (2020E10010).
Shen, F., Dai, G., Lin, G. et al. EEG-based emotion recognition using 4D convolutional recurrent neural network. Cogn Neurodyn 14, 815–828 (2020). https://doi.org/10.1007/s11571-020-09634-1