A Dual-Branch Dynamic Graph Convolution Based Adaptive TransFormer Feature Fusion Network for EEG Emotion Recognition
IEEE Transactions on Affective Computing (IF 9.6) Pub Date: 2022-08-15, DOI: 10.1109/taffc.2022.3199075
Mingyi Sun, Weigang Cui, Shuyue Yu, Hongbin Han, Bin Hu, Yang Li

Electroencephalogram (EEG) emotion recognition plays an important role in the brain-computer interface (BCI) field. However, most recent methods adopt shallow graph neural networks built on a single temporal feature, which limits emotion classification performance. Furthermore, existing methods generally ignore individual divergence between subjects, resulting in poor transfer performance. To address these deficiencies, we propose a dual-branch dynamic graph convolution based adaptive Transformer feature fusion network with adapter-finetuned transfer learning (DBGC-ATFFNet-AFTL) for EEG emotion recognition. Specifically, a dual-branch graph convolution network (DBGCN) is first designed to capture the temporal and spectral characteristics of EEG simultaneously. Second, the adaptive Transformer feature fusion network (ATFFNet) integrates the resulting feature maps through a channel-weight unit, emphasizing the differences between channels. Finally, the adapter-finetuned transfer learning method (AFTL) is applied to cross-subject emotion recognition and proves parameter-efficient with only a few samples from the target subject. Experimental results on three datasets show that the proposed method achieves promising emotion classification performance compared with state-of-the-art methods. The code of our proposed method will be available at: https://github.com/smy17/DANet.
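The dual-branch idea described above can be sketched very loosely in numpy: each branch applies a graph convolution over EEG electrodes with a shared learnable adjacency, and a channel-weight unit re-weights the concatenated feature maps. All layer sizes, function names, and the softmax-normalized adjacency are illustrative assumptions, not the authors' implementation; see their repository for the real code.

```python
import numpy as np

rng = np.random.default_rng(0)

def graph_conv(x, adj, w):
    # x: (n_channels, in_dim), adj: (n_channels, n_channels), w: (in_dim, out_dim).
    # Row-softmax the learnable adjacency so each electrode aggregates a
    # convex combination of the other electrodes' features ("dynamic" graph).
    a = np.exp(adj) / np.exp(adj).sum(axis=1, keepdims=True)
    return np.maximum(a @ x @ w, 0.0)  # ReLU activation

def channel_weight(features, logits):
    # Channel-weight unit: softmax over electrodes, so salient channels
    # contribute more to the fused representation.
    w = np.exp(logits) / np.exp(logits).sum()
    return features * w[:, None]

n_ch, t_dim, f_dim, hid = 62, 32, 32, 16       # e.g. 62 EEG electrodes
x_t = rng.standard_normal((n_ch, t_dim))       # temporal-branch input
x_f = rng.standard_normal((n_ch, f_dim))       # spectral-branch input
adj = rng.standard_normal((n_ch, n_ch))        # learnable adjacency (shared)

h_t = graph_conv(x_t, adj, rng.standard_normal((t_dim, hid)))
h_f = graph_conv(x_f, adj, rng.standard_normal((f_dim, hid)))

# Fuse both branches, then re-weight per channel before the Transformer stage.
fused = channel_weight(np.concatenate([h_t, h_f], axis=1),
                       rng.standard_normal(n_ch))
print(fused.shape)  # (62, 32)
```

In the full model the fused map would feed a Transformer encoder, and adapter layers inserted into that encoder would be the only parameters fine-tuned per target subject; this sketch stops at the fusion step.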

Updated: 2024-08-28