Improving the generalization performance of deep networks by dual pattern learning with adversarial adaptation
Knowledge-Based Systems (IF 7.2), Pub Date: 2020-05-12, DOI: 10.1016/j.knosys.2020.106016
Haimin Zhang, Min Xu

In this paper, we present a dual pattern learning network architecture with adversarial adaptation (DPLAANet). Unlike conventional networks, the proposed network has two input branches and two loss functions. This architecture forces the network to learn robust features by analysing dual inputs. The dual input structure allows the network to be trained on a considerably larger number of image pairs, which helps address the overfitting issue caused by limited training data. In addition, we propose to associate the two input branches with two random interest values during training. As a stochastic regularization technique, this method improves the generalization performance. Moreover, we introduce an adversarial training approach to reduce the domain difference between fused image features and single-image features. Extensive experiments on CIFAR-10, CIFAR-100, FI-8, the Google commands dataset, and MNIST demonstrate that our DPLAANets outperform the baseline networks. The experimental results on subsets of CIFAR-10, CIFAR-100, and MNIST demonstrate that DPLAANets generalize well on small datasets. The proposed architecture can be easily extended to have more than two input branches. The experimental results on subsets of MNIST show that the three-branch architecture outperforms the two-branch one when the training set is extremely small.
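The page carries only the abstract, so the sketch below is a hypothetical illustration rather than the authors' implementation: a minimal PyTorch rendering of the described ideas, i.e. two input branches whose fused features are classified under two losses weighted by random interest values, plus a small discriminator that adversarially pulls the fused features toward the single-image feature domain. All names (DualPatternNet, training_step, adv_weight), the fusion operator, and the interest-value sampling scheme are assumptions made for illustration.

# Hypothetical sketch (assumptions, not the authors' code): a dual-branch
# classifier with fused features, two classification losses weighted by
# random "interest" values, and a discriminator used adversarially to align
# fused features with single-image features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualPatternNet(nn.Module):
    def __init__(self, num_classes=10, feat_dim=128):
        super().__init__()
        # Shared backbone applied to each of the two input branches.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 4 * 4, feat_dim), nn.ReLU(),
        )
        self.fuse = nn.Linear(2 * feat_dim, feat_dim)   # fuse the two branch features
        self.head1 = nn.Linear(feat_dim, num_classes)   # predicts the label of input 1
        self.head2 = nn.Linear(feat_dim, num_classes)   # predicts the label of input 2
        # Discriminator: single-image feature (label 1) vs. fused feature (label 0).
        self.discriminator = nn.Sequential(
            nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x1, x2):
        f1, f2 = self.backbone(x1), self.backbone(x2)
        fused = F.relu(self.fuse(torch.cat([f1, f2], dim=1)))
        return self.head1(fused), self.head2(fused), f1, fused


def training_step(model, x1, y1, x2, y2, opt, opt_d, adv_weight=0.1):
    """One training step: two weighted classification losses plus adversarial
    adaptation between fused and single-image features."""
    logits1, logits2, single_feat, fused_feat = model(x1, x2)

    # Random interest values for the two branches (assumed sampling scheme).
    w = torch.rand(2)
    w = w / w.sum()
    cls_loss = w[0] * F.cross_entropy(logits1, y1) + w[1] * F.cross_entropy(logits2, y2)

    # 1) Train the discriminator to separate the two feature domains.
    d_real = model.discriminator(single_feat.detach())
    d_fake = model.discriminator(fused_feat.detach())
    d_loss = (F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real)) +
              F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the network: classification plus fooling the discriminator, so the
    #    fused features move toward the single-image feature domain.
    adv_logits = model.discriminator(fused_feat)
    adv_loss = F.binary_cross_entropy_with_logits(adv_logits, torch.ones_like(adv_logits))
    loss = cls_loss + adv_weight * adv_loss
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# Example optimizer setup (the discriminator gets its own optimizer):
# model = DualPatternNet()
# opt   = torch.optim.SGD((p for n, p in model.named_parameters()
#                          if not n.startswith("discriminator")), lr=0.01, momentum=0.9)
# opt_d = torch.optim.Adam(model.discriminator.parameters(), lr=1e-4)

One common choice at test time is to feed the same image to both branches; the paper's actual inference procedure, fusion operator, and interest-value distribution may differ from the choices made above.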


