Concept Drift Adaptation for Time Series Anomaly Detection via Transformer
Neural Processing Letters ( IF 3.1 ) Pub Date : 2022-08-28 , DOI: 10.1007/s11063-022-11015-0
Chaoyue Ding , Jing Zhao , Shiliang Sun

Time series anomaly detection (TSAD) is an essential task in practical applications such as data monitoring and network security. A common approach to anomaly detection is to use sequential models. As an effective sequence model, the Transformer can capture long-term dependencies in time series and is therefore expected to perform well on anomaly detection tasks. However, two problems remain when applying the Transformer to anomaly detection. (1) Failure to adapt to concept drift: the vanilla Transformer assumes that the training and test data come from the same distribution, but practical settings often violate this assumption, since the time-varying nature of time-series data can cause concept drift. (2) High computational complexity: the inference-time complexity of the vanilla Transformer increases quadratically with the sequence length L. To solve the first problem, we propose the concept drift adaptation method (CDAM), a distribution adaptation method that dynamically tunes the learning rate of the Transformer. CDAM fully utilizes the old-concept data to optimize a new model on the new-concept data through an online learning strategy. To address the second problem, we propose root square sparse self-attention, which requires only \(O(L\sqrt{L})\) time complexity. Experimental results on several anomaly detection benchmarks show that our model outperforms many anomaly detection methods, especially on time series with concept drift.
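The abstract does not specify the exact sparsity pattern behind the \(O(L\sqrt{L})\) attention, so the following is only a minimal illustrative sketch of how such a complexity bound can arise: if each of the L query positions attends to a window of only \(\lceil\sqrt{L}\rceil\) keys (here, a hypothetical causal local window) instead of all L keys, the total score computation drops from \(O(L^2)\) to \(O(L\sqrt{L})\). The function name and windowing choice are assumptions for illustration, not the paper's method.

```python
import math
import numpy as np

def sqrt_sparse_attention(q, k, v):
    """Illustrative sparse self-attention: each query attends only to the
    previous ceil(sqrt(L)) positions (a hypothetical causal local window),
    so total work is O(L * sqrt(L)) rather than O(L^2) for full attention."""
    L, d = q.shape
    w = math.ceil(math.sqrt(L))          # sqrt(L)-sized key window per query
    out = np.zeros_like(v)
    for i in range(L):
        lo = max(0, i - w + 1)           # window start (clipped at sequence start)
        scores = q[i] @ k[lo:i + 1].T / math.sqrt(d)
        weights = np.exp(scores - scores.max())   # numerically stable softmax
        weights /= weights.sum()
        out[i] = weights @ v[lo:i + 1]   # weighted sum over the sparse window
    return out
```

Any pattern that caps the number of attended keys per query at \(O(\sqrt{L})\) (local windows, strided blocks, etc.) yields the same asymptotic bound; the paper's root square sparse self-attention presumably chooses its pattern to preserve the dependencies needed for anomaly detection.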




Updated: 2022-08-29