Fast Training of Deep LSTM Networks with Guaranteed Stability for Nonlinear System Modeling
Neurocomputing (IF 6) Pub Date: 2021-01-01, DOI: 10.1016/j.neucom.2020.09.030
Wen Yu , Jesus Gonzalez , Xiaoou Li

Abstract: Deep recurrent neural networks (RNNs), such as LSTM, have many advantages over feedforward networks for nonlinear system modeling. However, the most widely used training method, backpropagation through time (BPTT), is very slow. In this paper, by separating the LSTM cell into a forward model and a recurrent model, we obtain a faster training method than BPTT. The deep LSTM is modified by combining the deep RNN with multilayer perceptrons (MLPs). Backpropagation-like training methods are proposed for training the deep RNN and the MLPs, and the stability of these algorithms is proven. Simulation results show that our fast training methods for LSTM outperform the conventional approaches.
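The abstract's central idea is replacing full BPTT with a backpropagation-like update. The sketch below does not reproduce the paper's exact forward/recurrent decomposition; it is a minimal illustration, under assumed toy dynamics and hypothetical weight names, of the general principle: the previous hidden and cell states are held fixed ("forward model"), so each update is an ordinary single-step backpropagation rather than backpropagation through the whole time horizon.

```python
import numpy as np

# Hedged sketch: not the paper's algorithm, only the truncation idea.
# Gradients are cut at each time step, so no unrolling over time occurs.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_in, n_h = 1, 8
# One weight matrix per gate, acting on the concatenation [x_t, h_{t-1}].
W = {g: rng.normal(0, 0.3, (n_h, n_in + n_h)) for g in "ifoc"}
b = {g: np.zeros(n_h) for g in "ifoc"}
w_out = rng.normal(0, 0.3, n_h)  # linear readout for the scalar output

def lstm_step(x, h, c):
    z = np.concatenate([x, h])
    i = sigmoid(W["i"] @ z + b["i"])
    f = sigmoid(W["f"] @ z + b["f"])
    o = sigmoid(W["o"] @ z + b["o"])
    g = np.tanh(W["c"] @ z + b["c"])
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new, (z, i, f, o, g, c, c_new)

def train_epoch(xs, ys, lr=0.05):
    """One pass over the sequence with truncated (single-step) gradients."""
    global w_out
    h, c = np.zeros(n_h), np.zeros(n_h)
    total = 0.0
    for x, y in zip(xs, ys):
        h, c, cache = lstm_step(np.array([x]), h, c)
        z, i, f, o, g, c_prev, c_new = cache
        err = w_out @ h - y
        total += 0.5 * err ** 2
        # Backprop with h_{t-1}, c_{t-1} treated as constants (no BPTT).
        dh = err * w_out
        w_out = w_out - lr * err * h
        do = dh * np.tanh(c_new)
        dc = dh * o * (1 - np.tanh(c_new) ** 2)
        di, dg, df = dc * g, dc * i, dc * c_prev
        for gate, dpre in (("i", di * i * (1 - i)),
                           ("f", df * f * (1 - f)),
                           ("o", do * o * (1 - o)),
                           ("c", dg * (1 - g ** 2))):
            W[gate] -= lr * np.outer(dpre, z)
            b[gate] -= lr * dpre
        # A BPTT implementation would carry dh, dc to the next step; we drop them.
    return total / len(xs)

# Toy nonlinear system identification task (assumed, not from the paper).
xs = np.sin(0.3 * np.arange(200))
ys = 0.6 * xs + 0.3 * np.roll(xs, 1)
losses = [train_epoch(xs, ys) for _ in range(30)]
print(round(losses[0], 4), round(losses[-1], 4))
```

Because every update uses only quantities from the current step, the cost per step is constant in the sequence length, which is the source of the speedup the abstract claims over BPTT; the paper's contribution is proving stability guarantees for such updates, which this sketch does not address.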

Updated: 2021-01-01