A New Concept of Multiple Neural Networks Structure Using Convex Combination.
IEEE Transactions on Neural Networks and Learning Systems (IF 10.2). Pub Date: 2020-02-20. DOI: 10.1109/tnnls.2019.2962020
Yu Wang , Yue Deng , Yilin Shen , Hongxia Jin

In this article, a new concept of a convex-combined multiple neural networks (NNs) structure is proposed. This new approach uses the collective information from multiple NNs to train the model. Based on both theoretical and experimental analyses, the new approach is shown to achieve faster training convergence with similar or even better test accuracy than a conventional NN structure. Two experiments are conducted to demonstrate the performance of the new structure: the first is a semantic frame parsing task for spoken language understanding (SLU) on the Airline Travel Information System (ATIS) data set, and the other is a handwritten digit recognition task on the Modified National Institute of Standards and Technology (MNIST) data set. We test this new structure using both recurrent NNs and convolutional NNs on these two tasks. The results of both experiments demonstrate 4x-8x faster training with similar or better accuracy when using this new concept.
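The abstract does not spell out how the convex combination is formed; a common construction, shown below as a minimal NumPy sketch, passes the same input through several independent networks and mixes their outputs with non-negative weights that sum to one (here obtained via a softmax over learnable logits). The network shapes, the softmax parameterization, and the variable names (`alpha`, `nets`) are illustrative assumptions, not details from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax; guarantees weights >= 0 and sum to 1,
    # which is exactly the convexity constraint on the combination.
    e = np.exp(z - np.max(z))
    return e / e.sum()

rng = np.random.default_rng(0)

# Two tiny "member networks": single affine layers with tanh activation.
# (The paper uses RNNs/CNNs; these stand in for any model with matching output shape.)
nets = [
    {"W": rng.standard_normal((4, 3)), "b": rng.standard_normal(3)}
    for _ in range(2)
]

def forward(net, x):
    return np.tanh(x @ net["W"] + net["b"])

# Unconstrained mixing logits (hypothetical); in training these would be
# updated jointly with the member networks' parameters.
alpha = np.array([0.3, -0.1])

def convex_combined(x):
    w = softmax(alpha)                                # convex weights
    outs = np.stack([forward(n, x) for n in nets])    # shape (K, 3)
    return np.tensordot(w, outs, axes=1)              # weighted sum of outputs

x = rng.standard_normal(4)
y = convex_combined(x)
```

Because the weights are convex, the combined output lies elementwise between the minimum and maximum of the member networks' outputs, so the mixture can never be more extreme than its members.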

Updated: 2020-02-20