Memory-based Transformer with shorter window and longer horizon for multivariate time series forecasting
Pattern Recognition Letters (IF 3.9) · Pub Date: 2022-05-11 · DOI: 10.1016/j.patrec.2022.05.010
Yang Liu, Zheng Wang, Xinyang Yu, Xin Chen, Meijun Sun

Multivariate time series forecasting is an important problem that spans many fields. One challenge is the complex, non-linear interdependence between time steps and between variables. Recent studies have shown that the Transformer has potential for capturing long-term dependencies. In the field of time series forecasting, however, the Transformer still has unresolved problems, such as prediction fragmentation and insensitivity to data scale. In addition, traditional forecasting models often require a large amount of input data to support training when predicting over long horizons, yet sufficient time series input can be hard to obtain due to equipment failure or weather conditions. To address these limitations, a memory-based Transformer with a shorter window and a longer horizon, called SWLHT, is proposed. Its memory mechanism frees the model from relying on a single input: previous forecast results are combined to assist in capturing long-term dependencies, thereby avoiding the need for excessively long input sequences. The memory mechanism also alleviates prediction fragmentation to some extent. Experimental results and comparisons against baselines on several real-world multivariate time series datasets verify the effectiveness of the proposed model.
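
The abstract describes the idea only at a high level; the following is a minimal, hypothetical PyTorch sketch of a memory-augmented Transformer forecaster in that spirit. It is not the paper's actual SWLHT architecture: the class name, the FIFO memory of cached window summaries, and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MemoryTransformerForecaster(nn.Module):
    """Illustrative sketch only (not the paper's SWLHT): a Transformer
    encoder over a short input window, augmented with a FIFO memory of
    summaries from previously seen windows, so the model need not rely
    on a single long input sequence."""

    def __init__(self, n_vars, d_model=64, n_heads=4, n_layers=2,
                 horizon=24, memory_slots=8):
        super().__init__()
        self.input_proj = nn.Linear(n_vars, d_model)  # embed each time step
        enc_layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        # predict the whole long horizon in one shot from a window summary
        self.head = nn.Linear(d_model, horizon * n_vars)
        self.horizon, self.n_vars = horizon, n_vars
        self.memory_slots = memory_slots
        self.memory = []  # FIFO list of (1, d_model) summaries of past windows

    def forward(self, window):
        # window: (batch, short_window_len, n_vars)
        x = self.input_proj(window)
        if self.memory:
            mem = torch.stack(self.memory, dim=1)   # (1, M, d_model)
            mem = mem.expand(x.size(0), -1, -1)     # broadcast over the batch
            x = torch.cat([mem, x], dim=1)          # prepend memory tokens
        h = self.encoder(x)
        summary = h[:, -1]                          # last-position summary
        pred = self.head(summary).view(-1, self.horizon, self.n_vars)
        # cache a detached summary so later short windows can attend to it
        self.memory.append(summary.mean(dim=0, keepdim=True).detach())
        self.memory = self.memory[-self.memory_slots:]
        return pred

model = MemoryTransformerForecaster(n_vars=8)
y = model(torch.randn(4, 16, 8))  # short 16-step window, 8 variables
print(y.shape)                    # torch.Size([4, 24, 8])
```

In this sketch the "shorter window, longer horizon" trade-off shows up as a 16-step input producing a 24-step forecast, with the memory tokens standing in for history that a conventional model would need as extra input.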




Updated: 2022-05-11