IncLSTM: Incremental Ensemble LSTM Model towards Time Series Data
Computers & Electrical Engineering (IF 4.0), Pub Date: 2021-04-25, DOI: 10.1016/j.compeleceng.2021.107156
Huiju Wang , Mengxuan Li , Xiao Yue

Long short-term memory (LSTM) is one of the most widely used recurrent neural networks. Traditionally, it is trained in an offline batch mode. To be updated with new data, the network has to be retrained on the merged old and new data, which is very time-consuming and causes catastrophic forgetting. To address this issue, we propose an incremental ensemble LSTM model, IncLSTM, which fuses ensemble learning and transfer learning to update the model incrementally. The experimental results show that, on average, the proposed method decreases training time by 18.8% and improves prediction accuracy by 15.6% compared with traditional methods. More importantly, the larger the training data, the more efficient IncLSTM becomes. While the new model is being updated, the current model continues to predict independently and concurrently, and the switch from the current model to the new model occurs once the update is completed, which significantly improves the training efficiency of the model.
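
The predict-while-updating-then-switch behavior described in the abstract can be illustrated with a minimal, hypothetical sketch. This is not the paper's IncLSTM implementation: `HotSwapPredictor`, `build_and_train`, and `MeanModel` are invented names, and the ensemble/transfer-learning training itself is abstracted behind the `build_and_train` callable. The sketch only shows the serving pattern, where the current model keeps answering predictions while a replacement trains in a background thread and is swapped in atomically once ready.

```python
import threading


class HotSwapPredictor:
    """Serve predictions from the current model while a replacement
    trains in the background; switch atomically when training completes."""

    def __init__(self, model):
        self._model = model
        self._lock = threading.Lock()

    def predict(self, x):
        # Hold the lock only long enough to grab a reference, so the
        # current model keeps serving throughout an update.
        with self._lock:
            model = self._model
        return model.predict(x)

    def update_async(self, build_and_train, new_data):
        """Train a replacement model off the serving path and swap it in."""
        def worker():
            new_model = build_and_train(new_data)  # potentially slow training
            with self._lock:
                self._model = new_model  # the switch happens here
        thread = threading.Thread(target=worker, daemon=True)
        thread.start()
        return thread


if __name__ == "__main__":
    # A stand-in "model" so the sketch runs without any ML dependencies.
    class MeanModel:
        def __init__(self, data):
            self.mean = sum(data) / len(data)

        def predict(self, x):
            return self.mean

    predictor = HotSwapPredictor(MeanModel([1.0, 2.0, 3.0]))
    print(predictor.predict(None))                     # 2.0, from the current model
    t = predictor.update_async(MeanModel, [10.0, 20.0])
    t.join()                                           # wait so the demo is deterministic
    print(predictor.predict(None))                     # 15.0, from the swapped-in model
```

In a real deployment the swap would typically be guarded by a validation step on the newly trained model, but that check is outside the scope of this sketch.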



Updated: 2021-04-26