GPCA: A Probabilistic Framework for Gaussian Process Embedded Channel Attention.
IEEE Transactions on Pattern Analysis and Machine Intelligence (IF 20.8). Pub Date: 2022-10-04. DOI: 10.1109/tpami.2021.3102955
Jiyang Xie, Zhanyu Ma, Dongliang Chang, Guoqiang Zhang, Jun Guo

Channel attention mechanisms have been widely applied in many visual tasks for effective performance improvement: they reinforce informative channels and suppress useless ones. Recently, various channel attention modules have been proposed and implemented in different ways; generally speaking, they are mainly based on convolution and pooling operations. In this paper, we propose the Gaussian process embedded channel attention (GPCA) module and further interpret channel attention schemes in a probabilistic way. The GPCA module models the correlations among channels, which are assumed to be captured by beta-distributed variables. Since the beta distribution cannot be integrated into the end-to-end training of convolutional neural networks (CNNs) with a mathematically tractable solution, we utilize an approximation of the beta distribution. Specifically, we adopt a Sigmoid-Gaussian approximation, in which Gaussian-distributed variables are mapped into the interval [0, 1] by a sigmoid function. A Gaussian process is then utilized to model the correlations among the channels, and in this case a mathematically tractable solution is derived. The GPCA module can be efficiently implemented and integrated into the end-to-end training of CNNs. Experimental results demonstrate the promising performance of the proposed GPCA module. Code is available at https://github.com/PRIS-CV/GPCA.
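To make the pipeline described above concrete, here is a minimal NumPy sketch of a GPCA-style channel attention pass. This is an illustrative assumption-laden toy, not the authors' implementation (which is in the linked repository and learns kernel parameters end-to-end using the analytic Sigmoid-Gaussian moments rather than sampling): the function names, the RBF kernel choice, and the use of pooled channel means as GP inputs are all assumptions made for illustration.

```python
import numpy as np

def gpca_attention(x, length_scale=1.0, noise=1e-3, seed=0):
    """Toy GPCA-style channel attention (illustrative, not the paper's code).

    x: feature map of shape (C, H, W).
    Returns (reweighted feature map, attention weights in [0, 1]).
    """
    C = x.shape[0]
    # 1) Per-channel descriptors via global average pooling.
    z = x.reshape(C, -1).mean(axis=1)                      # shape (C,)
    # 2) RBF kernel over the descriptors models inter-channel correlation.
    d2 = (z[:, None] - z[None, :]) ** 2
    K = np.exp(-d2 / (2.0 * length_scale ** 2)) + noise * np.eye(C)
    # 3) Gaussian-distributed attention logits drawn from the GP prior.
    rng = np.random.default_rng(seed)
    f = rng.multivariate_normal(np.zeros(C), K)
    # 4) Sigmoid maps the Gaussian variables into [0, 1] -- the
    #    Sigmoid-Gaussian approximation of beta-distributed weights.
    a = 1.0 / (1.0 + np.exp(-f))
    # 5) Scale each channel by its attention weight.
    return x * a[:, None, None], a
```

In an actual CNN the kernel hyperparameters would be trained jointly with the network, and the expectation of the sigmoid-transformed Gaussian would replace the Monte Carlo draw shown here.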

Updated: 2021-08-10