Diffusion Kernel Attention Network for Brain Disorder Classification
IEEE Transactions on Medical Imaging (IF 10.6), Pub Date: 2022-04-26, DOI: 10.1109/tmi.2022.3170701
Jianjia Zhang 1 , Luping Zhou 2 , Lei Wang 3 , Mengting Liu 1 , Dinggang Shen 4

Constructing and analyzing functional brain networks (FBN) has become a promising approach to brain disorder classification. However, the conventional pipeline of constructing and then analyzing the network in succession limits performance, because the subtasks cannot interact with or adapt to each other. Recently, the Transformer has demonstrated remarkable performance across a wide range of tasks, owing to its attention mechanism's effectiveness in modeling complex feature relationships. In this paper, for the first time, we develop a Transformer for integrated FBN modeling, analysis, and brain disorder classification from rs-fMRI data, proposing a Diffusion Kernel Attention Network to address the task-specific challenges. Specifically, directly applying a Transformer does not necessarily yield optimal performance in this task, because the large number of parameters in its attention module is at odds with the limited training samples typically available. To address this issue, we propose kernel attention as a replacement for the original dot-product attention module in the Transformer. This significantly reduces the number of trainable parameters, alleviating the small-sample problem, while introducing a non-linear attention mechanism for modeling complex functional connections. Another limitation of the Transformer for FBN applications is that it considers only pair-wise interactions between directly connected brain regions, ignoring the important indirect connections. Therefore, we further apply a diffusion process over the kernel attention to incorporate wider interactions among indirectly connected brain regions. Extensive experiments are conducted on the ADHD-200 data set for ADHD classification and on the ADNI data set for Alzheimer's disease classification, and the results demonstrate the superior performance of the proposed method over the competing methods.
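The two key ideas in the abstract — replacing learned dot-product attention with a kernel over region features, and diffusing the resulting attention matrix to reach indirectly connected regions — can be illustrated with a minimal NumPy sketch. This is a hedged illustration, not the authors' implementation: the RBF kernel, the truncated heat-kernel diffusion `exp(-t(I - A))`, and all function names (`kernel_attention`, `diffuse`) are assumptions for exposition; the paper's exact kernel and diffusion formulation may differ.

```python
import numpy as np

def kernel_attention(X, gamma=1.0):
    """Parameter-free RBF (Gaussian) kernel attention over region features.

    X: (n_regions, d) feature matrix (e.g., rs-fMRI time-series features).
    Replaces the learned Q/K projections of dot-product attention, so no
    attention parameters need to be trained (illustrative assumption).
    """
    sq = np.sum(X ** 2, axis=1)
    dist2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)  # pairwise squared distances
    K = np.exp(-gamma * np.maximum(dist2, 0.0))
    # Row-normalize so each row is a valid attention distribution.
    return K / K.sum(axis=1, keepdims=True)

def diffuse(A, t=1.0, n_terms=8):
    """Truncated heat-kernel diffusion exp(-t(I - A)) = e^{-t} * sum_k t^k A^k / k!.

    Powers A^k propagate attention along k-hop paths, so indirectly
    connected regions contribute to the final attention weights.
    """
    n = A.shape[0]
    out = np.zeros_like(A)
    term = np.eye(n)          # A^k, starting at k = 0
    coeff = np.exp(-t)        # e^{-t} * t^k / k!, starting at k = 0
    for k in range(n_terms):
        out += coeff * term
        term = term @ A
        coeff *= t / (k + 1)
    # Re-normalize rows to absorb the truncation error.
    return out / out.sum(axis=1, keepdims=True)

# Usage on synthetic features for 5 hypothetical brain regions.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
A = kernel_attention(X)
A_diff = diffuse(A, t=1.0)
```

Note how `diffuse` mixes in multi-hop terms `A^2, A^3, ...`, which is what lets two regions with no direct functional connection still influence each other's attention weights.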
