Multi-head mutual-attention CycleGAN for unpaired image-to-image translation
IET Image Processing ( IF 2.0 ) Pub Date : 2020-09-07 , DOI: 10.1049/iet-ipr.2019.1153
Wei Ji 1 , Jing Guo 1 , Yun Li 2

Image-to-image translation, i.e. mapping images from a source domain to a target domain, has made significant progress in recent years. The most popular method for unpaired image-to-image translation is CycleGAN. However, CycleGAN cannot accurately and rapidly learn the key features of the target domain, so the model converges slowly and its translation quality leaves room for improvement. In this study, a multi-head mutual-attention CycleGAN (MMA-CycleGAN) model is proposed for unpaired image-to-image translation. MMA-CycleGAN retains the cycle-consistency loss and adversarial loss of CycleGAN, but introduces a mutual-attention (MA) mechanism that enables attention-driven, long-range dependency modelling between the two image domains. To handle large images efficiently, MA is further extended to a multi-head mutual-attention (MMA) mechanism. In addition, domain labels are adopted to simplify the MMA-CycleGAN architecture, so that a single generator performs translation in both directions. Experiments on multiple datasets demonstrate that MMA-CycleGAN learns rapidly and obtains photo-realistic images in less time than CycleGAN.
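The abstract does not give the layer-level details of the MMA mechanism, but the idea of multi-head attention where queries come from one domain's features and keys/values from the other can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name `MutualAttention`, the use of `torch.nn.MultiheadAttention`, the residual connections, and all shapes are assumptions.

```python
import torch
import torch.nn as nn

class MutualAttention(nn.Module):
    """Hypothetical sketch of multi-head mutual attention between two domains.

    Queries from domain A attend over keys/values from domain B (and vice
    versa), giving each domain attention-driven access to long-range
    structure in its counterpart.
    """

    def __init__(self, channels: int, num_heads: int = 8):
        super().__init__()
        # One multi-head attention block per direction (A->B and B->A).
        self.attn_ab = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.attn_ba = nn.MultiheadAttention(channels, num_heads, batch_first=True)

    def forward(self, feat_a: torch.Tensor, feat_b: torch.Tensor):
        # feat_a, feat_b: (batch, channels, H, W) feature maps from the two domains.
        b, c, h, w = feat_a.shape
        seq_a = feat_a.flatten(2).transpose(1, 2)  # (batch, H*W, channels)
        seq_b = feat_b.flatten(2).transpose(1, 2)
        # Domain A queries domain B, and domain B queries domain A.
        out_a, _ = self.attn_ab(seq_a, seq_b, seq_b)
        out_b, _ = self.attn_ba(seq_b, seq_a, seq_a)
        # Residual connections (an assumption) keep each domain's own content.
        out_a = (out_a + seq_a).transpose(1, 2).reshape(b, c, h, w)
        out_b = (out_b + seq_b).transpose(1, 2).reshape(b, c, h, w)
        return out_a, out_b

ma = MutualAttention(channels=64, num_heads=8)
fa = torch.randn(2, 64, 16, 16)
fb = torch.randn(2, 64, 16, 16)
oa, ob = ma(fa, fb)
```

Flattening the spatial grid into a sequence lets a standard multi-head attention layer model pixel-to-pixel dependencies across the two domains; splitting attention into multiple heads is what makes this tractable for larger feature maps, which matches the abstract's motivation for extending MA to MMA.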

Updated: 2020-09-08