Temporal Fusion Transformers for interpretable multi-horizon time series forecasting
International Journal of Forecasting (IF 6.9), Pub Date: 2021-06-16, DOI: 10.1016/j.ijforecast.2021.03.012
Bryan Lim, Sercan Ö. Arık, Nicolas Loeff, Tomas Pfister

Multi-horizon forecasting often contains a complex mix of inputs – including static (i.e. time-invariant) covariates, known future inputs, and other exogenous time series that are only observed in the past – without any prior information on how they interact with the target. Several deep learning methods have been proposed, but they are typically ‘black-box’ models that do not shed light on how they use the full range of inputs present in practical scenarios. In this paper, we introduce the Temporal Fusion Transformer (TFT) – a novel attention-based architecture that combines high-performance multi-horizon forecasting with interpretable insights into temporal dynamics. To learn temporal relationships at different scales, TFT uses recurrent layers for local processing and interpretable self-attention layers for long-term dependencies. TFT utilizes specialized components to select relevant features and a series of gating layers to suppress unnecessary components, enabling high performance in a wide range of scenarios. On a variety of real-world datasets, we demonstrate significant performance improvements over existing benchmarks, and highlight three practical interpretability use cases of TFT.
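The "gating layers to suppress unnecessary components" mentioned above are, in the paper, gated residual networks (GRNs) built on gated linear units (GLUs): a sigmoid gate scales a candidate nonlinear transformation, so the network can drive the gate toward zero and fall back to a near-identity residual path when the extra processing is not needed. Below is a minimal PyTorch sketch of this mechanism, assuming a single shared width `d_model`; class and variable names are illustrative and this is not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedLinearUnit(nn.Module):
    """GLU(x) = sigmoid(W4 x + b4) * (W5 x + b5).
    A gate near zero suppresses the transformed component entirely."""
    def __init__(self, d_model: int):
        super().__init__()
        self.fc = nn.Linear(d_model, 2 * d_model)  # produces value and gate jointly

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        value, gate = self.fc(x).chunk(2, dim=-1)
        return value * torch.sigmoid(gate)


class GatedResidualNetwork(nn.Module):
    """GRN(a, c) = LayerNorm(a + GLU(W1 ELU(W2 a + W3 c))),
    where c is an optional context vector (e.g. from static covariates)."""
    def __init__(self, d_model: int):
        super().__init__()
        self.fc_input = nn.Linear(d_model, d_model)                 # W2
        self.fc_context = nn.Linear(d_model, d_model, bias=False)   # W3
        self.fc_hidden = nn.Linear(d_model, d_model)                 # W1
        self.glu = GatedLinearUnit(d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, a: torch.Tensor, c: torch.Tensor = None) -> torch.Tensor:
        hidden = self.fc_input(a)
        if c is not None:
            hidden = hidden + self.fc_context(c)  # condition on static context
        hidden = self.fc_hidden(F.elu(hidden))
        return self.norm(a + self.glu(hidden))    # gated residual connection


# Usage sketch: a batch of 8 series, 24 time steps, width 32.
grn = GatedResidualNetwork(d_model=32)
x = torch.randn(8, 24, 32)
out = grn(x)  # same shape; the GLU gate decides how much nonlinear processing to apply
```

The same GRN building block recurs throughout the architecture (variable selection, static enrichment, and the position-wise feed-forward stage), which is what lets the gating suppress unused components consistently across datasets of varying complexity.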



Updated: 2021-06-16