Learning efficient text-to-image synthesis via interstage cross-sample similarity distillation
Science China Information Sciences (IF 8.8). Pub Date: 2020-11-17, DOI: 10.1007/s11432-020-2900-x
Fengling Mao, Bingpeng Ma, Hong Chang, Shiguang Shan, Xilin Chen

For a given text, previous text-to-image synthesis methods commonly use a multistage generation model to produce high-resolution images in a coarse-to-fine manner. However, these methods ignore the interaction among stages and do not enforce consistent cross-sample relations among the images generated at different stages. These deficiencies result in inefficient generation and discrimination. In this study, we propose an interstage cross-sample similarity distillation model based on a generative adversarial network (GAN) for learning efficient text-to-image synthesis. To strengthen the interaction among stages, we perform interstage knowledge distillation from the refined stage to the coarse stages with novel interstage cross-sample similarity distillation blocks. To enhance the constraint on the cross-sample relations of the images generated at different stages, we conduct cross-sample similarity distillation among the stages. Extensive experiments on the Oxford-102 and Caltech-UCSD Birds-200-2011 (CUB) datasets show that our model generates visually pleasing images and achieves performance quantitatively comparable to state-of-the-art methods.
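The abstract does not spell out the distillation objective, but the core idea of cross-sample similarity distillation can be illustrated with a minimal sketch: compute the pairwise similarity matrix over a batch of features from the coarse (student) stage and from the refined (teacher) stage, then penalize the mismatch between the two matrices. The function names, the cosine-similarity choice, and the MSE penalty below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def pairwise_cosine_similarity(feats):
    """feats: (batch, dim) image features from one generation stage.
    Returns the (batch, batch) cosine-similarity matrix across samples."""
    norms = np.linalg.norm(feats, axis=1, keepdims=True)
    normalized = feats / np.clip(norms, 1e-8, None)
    return normalized @ normalized.T

def cross_sample_similarity_distillation_loss(coarse_feats, refined_feats):
    """Illustrative distillation loss: match the coarse stage's cross-sample
    similarity structure to that of the refined (teacher) stage."""
    s_coarse = pairwise_cosine_similarity(coarse_feats)
    s_refined = pairwise_cosine_similarity(refined_feats)
    return np.mean((s_coarse - s_refined) ** 2)
```

In a multistage GAN, a loss of this form would be added to each coarse stage's objective, with gradients stopped on the refined-stage features so that knowledge flows only from the refined stage to the coarse ones.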




Updated: 2020-11-21