GopGAN: Gradients Orthogonal Projection Generative Adversarial Network With Continual Learning
IEEE Transactions on Neural Networks and Learning Systems ( IF 10.4 ) Pub Date : 2021-07-16 , DOI: 10.1109/tnnls.2021.3093319
Xiaobin Li 1 , Weiqiang Wang 1

Generative adversarial networks (GANs) suffer from catastrophic forgetting in continual learning: they tend to forget previously learned generation tasks and retain only the task they have just learned. In this article, we present a novel conditional GAN, called the gradients orthogonal projection GAN (GopGAN), which updates its weights in the orthogonal complement of the subspace spanned by the representations of earlier training examples, and we mathematically demonstrate that this update rule retains knowledge of learned tasks while a new task is being learned. Furthermore, the orthogonal projection matrix for modulating gradients is derived mathematically, and an iterative algorithm for computing it during continual learning is given, so that training examples from learned tasks need not be stored when learning a new task. In addition, a task-dependent latent vector construction is presented, and the constructed conditional latent vectors are used as the inputs of the generator in GopGAN to prevent the orthogonal subspaces of learned tasks from vanishing. Extensive experiments on MNIST, EMNIST, SVHN, CIFAR10, and ImageNet-200 generation tasks show that the proposed GopGAN effectively mitigates catastrophic forgetting and stably retains learned knowledge.
