Image super-resolution via channel attention and spatial attention
Applied Intelligence (IF 3.4) Pub Date: 2021-06-07, DOI: 10.1007/s10489-021-02464-6
Enmin Lu , Xiaoxiao Hu

Deep convolutional networks have been widely applied to super-resolution (SR) tasks and have achieved excellent performance. However, although the self-attention mechanism is a hot research topic, it has not been widely applied to SR tasks. In this paper, we propose a new attention-based network that is more flexible and efficient than other generative adversarial network (GAN)-based methods. Specifically, we employ a convolutional block attention module (CBAM) and embed it into a dense block to exchange information efficiently throughout the feature maps. Furthermore, we construct our own spatial module based on the self-attention mechanism, which not only captures long-distance spatial connections but also provides greater stability for feature extraction. Experimental results demonstrate that our attention-based network improves both visual quality and quantitative evaluation results.
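The abstract names two building blocks: a CBAM unit embedded in a dense block, and a custom spatial module built on self-attention for long-distance dependencies. Below is a minimal PyTorch sketch of both, assuming the standard CBAM design (channel attention followed by spatial attention) and a non-local-style self-attention; the module names, reduction ratios, and zero-initialized residual scale are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """CBAM channel attention: a shared MLP over avg- and max-pooled features."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )

    def forward(self, x):
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1))
        return x * torch.sigmoid(avg + mx)


class SpatialAttention(nn.Module):
    """CBAM spatial attention: a 7x7 conv over channel-wise avg and max maps."""

    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))


class CBAM(nn.Module):
    """Channel attention followed by spatial attention; a drop-in unit that
    could sit inside a dense block, as the abstract describes."""

    def __init__(self, channels):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))


class SelfAttention2d(nn.Module):
    """Non-local-style spatial self-attention: every position attends to all
    others, capturing long-distance spatial connections. A hypothetical
    stand-in for the paper's custom spatial module."""

    def __init__(self, channels, reduction=8):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // reduction, 1)
        self.key = nn.Conv2d(channels, channels // reduction, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        # Zero-initialized residual scale: the module starts as an identity
        # mapping, which tends to stabilize early training (an assumption here).
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)  # (B, HW, C/r)
        k = self.key(x).flatten(2)                    # (B, C/r, HW)
        attn = torch.softmax(q @ k, dim=-1)           # (B, HW, HW)
        v = self.value(x).flatten(2)                  # (B, C, HW)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x
```

In this sketch, the CBAM unit would typically be applied to a dense block's concatenated feature maps before the next convolution, while the zero-initialized gamma in SelfAttention2d lets the network behave like a plain convolutional path early in training, matching the stability motivation mentioned in the abstract.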



Updated: 2021-06-07