Traffic transformer: Capturing the continuity and periodicity of time series for traffic forecasting
Transactions in GIS (IF 2.1) Pub Date: 2020-06-11, DOI: 10.1111/tgis.12644
Ling Cai, Krzysztof Janowicz, Gengchen Mai, Bo Yan, Rui Zhu

Traffic forecasting is a challenging problem due to the complexity of jointly modeling spatio‐temporal dependencies at different scales. Recently, several hybrid deep learning models have been developed to capture such dependencies. These approaches typically utilize convolutional neural networks or graph neural networks (GNNs) to model spatial dependency and leverage recurrent neural networks (RNNs) to learn temporal dependency. However, RNNs can only capture sequential information in a time series and cannot model its periodicity (e.g., weekly patterns). Moreover, RNNs are difficult to parallelize, making training and prediction less efficient. In this work, we propose a novel deep learning architecture called Traffic Transformer to capture the continuity and periodicity of time series and to model spatial dependency. Our work takes inspiration from Google's Transformer framework for machine translation. We conduct extensive experiments on two real‐world traffic data sets, and the results demonstrate that our model outperforms baseline models by a substantial margin.
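The abstract contrasts RNNs, which process a time series step by step, with the Transformer's self-attention, which lets every time step attend to every other step in parallel and can therefore relate distant observations (e.g., the same hour one week apart) directly. A minimal sketch of single-head scaled dot-product attention over a traffic series is shown below; all names, shapes, and weights are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention.

    x: (T, d) array — T time steps of d-dimensional traffic features.
    Returns the (T, d) context vectors and the (T, T) attention weights.
    Every step attends to every other step in one matrix product,
    avoiding the sequential bottleneck of an RNN.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (T, T) pairwise affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the time axis
    return weights @ v, weights

# Illustrative shapes: 12 time steps, 8 features per step.
rng = np.random.default_rng(0)
T, d = 12, 8
x = rng.standard_normal((T, d))
w_q, w_k, w_v = (rng.standard_normal((d, d)) for _ in range(3))

out, attn = self_attention(x, w_q, w_k, w_v)
print(out.shape)   # → (12, 8)
# Each row of attn is a probability distribution over all 12 steps,
# so step 0 can weight step 11 directly — no recurrence needed.
```

Because the attention matrix is computed for all step pairs at once, training parallelizes across the sequence, which is the efficiency argument the abstract makes against RNNs.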

Updated: 2020-06-11