Balanced Spatial Feature Distillation and Pyramid Attention Network for Lightweight Image Super-resolution
Neurocomputing ( IF 5.5 ) Pub Date : 2022-08-10 , DOI: 10.1016/j.neucom.2022.08.053
Garas Gendy , Nabil Sabor , Jingchao Hou , Guanghui He

Recently, attention mechanisms have become central to image super-resolution (SR) because they can extract different features from an image depending on the attention type used. Despite the great success of attention-based methods, conflicts among the features produced by different attention types can degrade SR performance. In this paper, we propose an efficient single-image SR model called the balanced spatial feature distillation and pyramid attention network (BSPAN). The idea behind BSPAN is a trade-off among the features extracted by different attention types. We also propose a balanced spatial feature distillation block (BSFDB) as the backbone of BSPAN, so that the network can effectively benefit from the different attention features. Two attention types, namely spatial attention residual feature distillation (SARFD) and classical attention (CA), are combined in the BSFDB, which balances them according to the content of the low-resolution feature map via a balancing attention block. The BSFDB is designed to improve SR performance while keeping the parameter count and computational complexity low. Moreover, to further improve SR performance, pyramid attention is introduced in the middle of the BSPAN network to capture long-range features at a variety of positions and scales. Based on evaluations over five benchmark datasets, we conclude that balancing the features of different attention types effectively improves SR performance. Accordingly, the proposed BSPAN model achieves significant gains over the state of the art, with superior visual quality and reconstruction accuracy.
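To make the balancing idea concrete, the following is a minimal sketch (not the authors' code) of a content-dependent gate that blends the outputs of two attention branches. It assumes a scalar gate derived from global average pooling of the low-resolution feature map; the names `w`, `b`, and `balance_features` are illustrative assumptions, not identifiers from the paper.

```python
import math


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


def balance_features(spatial_feats, classical_feats, w=0.5, b=0.0):
    """Blend features from two attention branches with a learned gate.

    The gate alpha is computed from the mean of the input features
    (global average pooling), so the mix adapts to image content:
    alpha close to 1 favors the spatial-attention branch, alpha close
    to 0 favors the classical-attention branch. In a real network, w
    and b would be learned parameters and the inputs would be tensors.
    """
    pooled = sum(spatial_feats) / len(spatial_feats)  # global average pool
    alpha = sigmoid(w * pooled + b)                   # gate in (0, 1)
    return [alpha * s + (1.0 - alpha) * c
            for s, c in zip(spatial_feats, classical_feats)]


# Example: blend a spatial-attention feature vector with a
# classical-attention feature vector of the same length.
out = balance_features([1.0, 2.0, 3.0], [0.0, 0.0, 0.0])
```

In the full model, this balancing step would be applied per feature map inside each BSFDB, with the gate predicted by a small sub-network rather than a fixed affine function.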




Updated: 2022-08-10