Hybrid time-aligned and context attention for time series prediction
Knowledge-Based Systems (IF 8.8), Pub Date: 2020-04-23, DOI: 10.1016/j.knosys.2020.105937
Zhumei Wang , Liang Zhang , Zhiming Ding

Time series forecasting provides extensive data support for sound decision making. Recently, many deep-learning-based forecasting models have been proposed; the main challenges are how to learn effective historical information and how to alleviate error propagation, especially in long-term prediction. In this paper, we present a new attention model based on an LSTM encoder–decoder architecture for long-term time series prediction. We define similar scenes of a time series, which comprise a periodic pattern and a time-nearest pattern, and provide a similar-scene search method. On this basis, we design a hybrid time-aligned and context attention model (HTC-Attn): the former attention focuses on the characteristics of the aligned positions, while the latter focuses on the contextual features of specific locations within similar scenes. An attention gate is designed to control the degree to which the prediction model absorbs each of the two attention types. Furthermore, the proposed model uses a double-layer encoder–decoder structure to learn the trend term and the time dependence of the series. Experimental results show that HTC-Attn effectively maintains long-term dependence and learns fine-grained detail in single-factor time series prediction tasks, and its accuracy consistently outperforms state-of-the-art baselines by at least 2%.
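
The abstract stops short of the model's equations, but the two mechanisms it names, a similar-scene search and an attention gate, can be illustrated with a minimal PyTorch sketch. Everything below is an assumption for illustration: the index-based scene search, the hidden dimension, and the sigmoid gating over concatenated context vectors are plausible readings of the abstract, not the paper's actual formulation.

import torch
import torch.nn as nn

def similar_scene_indices(t: int, period: int, window: int, n_near: int) -> list:
    """Hypothetical similar-scene search: collect the periodic pattern
    (steps at the same phase one period earlier, +/- window) and the
    time-nearest pattern (the n_near most recent steps) for target step t."""
    periodic = range(t - period - window, t - period + window + 1)
    nearest = range(t - n_near, t)
    return sorted({i for i in list(periodic) + list(nearest) if 0 <= i < t})

class HybridAttentionGate(nn.Module):
    """Assumed form of the attention gate: a sigmoid over the decoder state
    and both context vectors decides how much of the time-aligned context
    versus the context-attention vector the decoder absorbs."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.gate = nn.Linear(3 * hidden_dim, hidden_dim)

    def forward(self, c_align: torch.Tensor, c_ctx: torch.Tensor,
                dec_state: torch.Tensor) -> torch.Tensor:
        # g in (0, 1) per dimension; g -> 1 favors the time-aligned context.
        g = torch.sigmoid(self.gate(torch.cat([c_align, c_ctx, dec_state], dim=-1)))
        return g * c_align + (1.0 - g) * c_ctx

For example, with hidden_dim = 64 and batch size 1, HybridAttentionGate(64)(torch.randn(1, 64), torch.randn(1, 64), torch.randn(1, 64)) returns the fused (1, 64) context a decoder step would consume, and similar_scene_indices(200, period=96, window=2, n_near=6) selects steps 102-106 (periodic pattern) and 194-199 (time-nearest pattern) of the history.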



Updated: 2020-04-23