LGGNet: Learning from Local-Global-Graph Representations for Brain-Computer Interface
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2021-05-05, DOI: arxiv-2105.02786
Yi Ding, Neethu Robinson, Qiuhao Zeng, Cuntai Guan

In this paper, we propose LGG, a neurologically inspired graph neural network, to learn local-global-graph representations from electroencephalography (EEG) for brain-computer interfaces (BCI). A temporal convolutional layer with multi-scale 1D convolutional kernels and kernel-level attention fusion is proposed to learn the temporal dynamics of EEG. Inspired by neurological knowledge of cognitive processes in the brain, we propose local and global graph-filtering layers that learn brain activity within and between different functional areas of the brain, modeling the complex relations among them during cognitive processes. Under robust nested cross-validation settings, the proposed method is evaluated on the publicly available DEAP dataset, and its classification performance is compared with state-of-the-art methods, including FBFgMDM, FBTSC, unsupervised learning, DeepConvNet, ShallowConvNet, EEGNet, and TSception. The results show that the proposed method outperforms all these state-of-the-art methods, and the improvements are statistically significant (p < 0.05) in most cases. The source code can be found at: https://github.com/yi-ding-cs/LGG
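To make the two architectural ideas in the abstract concrete, the sketch below illustrates (a) a multi-scale temporal convolution with kernel-level attention fusion and (b) a simple graph-filtering layer with a learnable adjacency. This is not the authors' LGG implementation (that is in the linked repository); it is a minimal PyTorch illustration, and the class names (MultiScaleTemporalConv, GraphFilter), kernel lengths, channel counts, and tensor shapes are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleTemporalConv(nn.Module):
    """Parallel 1D temporal convolutions fused with kernel-level attention (illustrative)."""

    def __init__(self, in_ch=1, out_ch=16, kernel_sizes=(15, 31, 63)):
        super().__init__()
        # one branch per temporal kernel length; padding keeps the time axis unchanged
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, kernel_size=(1, k), padding=(0, k // 2))
            for k in kernel_sizes
        )
        # a learned score per kernel scale for the attention fusion
        self.score = nn.Linear(out_ch, 1)

    def forward(self, x):  # x: (batch, 1, eeg_channels, time)
        feats = torch.stack([F.relu(b(x)) for b in self.branches], dim=1)  # (B, K, C, ch, T)
        scores = self.score(feats.mean(dim=(3, 4)))                        # (B, K, 1)
        weights = torch.softmax(scores, dim=1)[..., None, None]            # (B, K, 1, 1, 1)
        return (weights * feats).sum(dim=1)                                # fused: (B, C, ch, T)


class GraphFilter(nn.Module):
    """A basic graph-filtering layer with a learnable adjacency: H' = ReLU(A H W)."""

    def __init__(self, n_nodes, in_dim, out_dim):
        super().__init__()
        self.adj = nn.Parameter(torch.eye(n_nodes))  # initialised to self-connections
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, h):  # h: (batch, n_nodes, in_dim)
        a = torch.softmax(F.relu(self.adj), dim=-1)  # row-normalised adjacency
        return F.relu(self.proj(torch.matmul(a, h)))


if __name__ == "__main__":
    # toy batch: 8 trials, 32 EEG channels, 512 time samples
    x = torch.randn(8, 1, 32, 512)
    t_out = MultiScaleTemporalConv()(x)            # (8, 16, 32, 512)
    # treat each EEG channel as a graph node with flattened temporal features
    nodes = t_out.permute(0, 2, 1, 3).flatten(2)   # (8, 32, 16 * 512)
    g_out = GraphFilter(n_nodes=32, in_dim=16 * 512, out_dim=64)(nodes)
    print(t_out.shape, g_out.shape)                # (8, 16, 32, 512) (8, 32, 64)

In the paper, the graph filtering is applied locally within predefined functional brain regions and then globally across regions; the sketch uses a single learnable adjacency over all channels only to show the basic operation.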

Updated: 2021-05-07