Interpretable Generative Adversarial Networks With Exponential Function
IEEE Transactions on Signal Processing ( IF 4.6 ) Pub Date : 2021-06-16 , DOI: 10.1109/tsp.2021.3089285
Rui She , Pingyi Fan , Xiao-Yang Liu , Xiaodong Wang

For Generative Adversarial Networks (GANs), interpretability is closely related to the optimization objective function: information metrics play an important role in network training and data generation. In the original GAN, the objective function based on the Kullback-Leibler (KL) divergence limits the performance of data training and generation. It is therefore worthwhile to investigate, from the perspective of information metrics, objective functions for GAN optimization that improve the efficiency of network learning. In this paper, an objective function with exponential form, derived from the Message Importance Measure (MIM), replaces the logarithm-form objective in the adversarial optimization. The resulting approach, named MIM-based GAN, may expose more hidden information, improving the interpretability of the training process and of probability-event generation. Specifically, we first analyze the intrinsic relationship between the proposed approach and other classical GANs. We then discuss its theoretical advantages in training performance, including sensitivity and convergence rate, over the original GAN, LSGAN, and WGAN. Finally, we run simulations on the datasets to confirm why the MIM-based GAN achieves state-of-the-art performance in training and data generation.
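The core idea — swapping the logarithm-form discriminator loss for an exponential-form one — can be sketched as follows. The exact MIM-based objective from the paper is not reproduced here; the functions `gan_d_loss` and `mim_d_loss` and the particular exponential form `exp(1 - D(x)) + exp(D(G(z)))` are illustrative assumptions, not the authors' definitive formulation.

```python
import numpy as np

def gan_d_loss(d_real, d_fake):
    """Original GAN discriminator loss (logarithm form, minimized by D):
    -E[log D(x)] - E[log(1 - D(G(z)))]."""
    return -(np.log(d_real) + np.log(1.0 - d_fake)).mean()

def mim_d_loss(d_real, d_fake):
    """Hypothetical exponential-form (MIM-style) surrogate: each log term
    is replaced by an exponential penalty that stays finite even when the
    discriminator output saturates at 0 or 1. Illustrative only."""
    return (np.exp(1.0 - d_real) + np.exp(d_fake)).mean()

# D outputs on a small batch: probabilities for real and generated samples.
d_real = np.array([0.9, 0.8])
d_fake = np.array([0.1, 0.2])

log_loss = gan_d_loss(d_real, d_fake)
exp_loss = mim_d_loss(d_real, d_fake)

# Both losses are finite and positive here, and both grow as the
# discriminator's outputs degrade toward chance level (0.5).
assert np.isfinite(log_loss) and np.isfinite(exp_loss)
assert mim_d_loss(np.array([0.5]), np.array([0.5])) > exp_loss
```

One way to read the abstract's sensitivity claim through this sketch: the gradient of an exponential term is the term itself, so badly classified samples receive proportionally larger gradients without the unbounded blow-up of `log` near 0.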

Updated: 2021-06-16