Sequential changepoint detection in neural networks with checkpoints
Statistics and Computing (IF 2.2), Pub Date: 2022-03-13, DOI: 10.1007/s11222-022-10088-0
Michalis K. Titsias, Jakub Sygnowski, Yutian Chen

We introduce a framework for online changepoint detection and simultaneous model learning which is applicable to highly parametrized models, such as deep neural networks. It is based on detecting changepoints across time by sequentially performing generalized likelihood ratio tests that require only evaluations of simple prediction score functions. This procedure makes use of checkpoints, consisting of early versions of the actual model parameters, which allow distributional changes to be detected by performing predictions on future data. We define an algorithm that bounds the Type I error in the sequential testing procedure. We demonstrate the efficiency of our method in challenging continual learning applications with unknown task changepoints, and show improved performance compared to online Bayesian changepoint detection.
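
To make the sequential testing idea concrete, the sketch below is a minimal illustration rather than the authors' exact algorithm: it runs a generalized likelihood ratio (GLR) test for a mean shift on a stream of univariate prediction scores, where each score could be, for example, the log-loss gap between a frozen checkpoint of the model and the current model on a newly arrived batch. The names `window` and `threshold`, and the Gaussian mean-shift form of the statistic, are illustrative assumptions; the paper instead provides an algorithm that calibrates the procedure so the Type I error is bounded.

```python
# Minimal, illustrative sketch (not the paper's exact procedure): a sequential
# GLR test for a mean shift in a stream of prediction scores. Each score is
# assumed to be a scalar summary, e.g. the log-loss gap between a checkpoint
# and the current model on the latest data batch.

import numpy as np


def glr_statistic(scores: np.ndarray) -> float:
    """GLR statistic for a single mean change in approximately Gaussian scores.

    Scans all candidate changepoints tau and returns the maximum standardized
    squared difference between the pre- and post-tau sample means.
    """
    n = len(scores)
    best = 0.0
    # Rough plug-in variance estimate from the whole window.
    pooled_var = np.var(scores, ddof=1) + 1e-12
    for tau in range(1, n):
        pre, post = scores[:tau], scores[tau:]
        stat = (tau * (n - tau) / n) * (pre.mean() - post.mean()) ** 2 / pooled_var
        best = max(best, stat)
    return best


def sequential_detection(score_stream, window: int = 50, threshold: float = 15.0):
    """Run the GLR test each time a new score arrives; stop at the first alarm.

    In practice `threshold` would be calibrated (e.g. by simulation under the
    no-change hypothesis) so that the Type I error over the horizon is bounded.
    """
    buffer = []
    for t, score in enumerate(score_stream):
        buffer.append(score)
        if len(buffer) > window:
            buffer.pop(0)  # keep only the most recent `window` scores
        if len(buffer) >= 4 and glr_statistic(np.asarray(buffer)) > threshold:
            return t  # alarm: a changepoint is declared at time t
    return None  # no changepoint detected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated score stream whose mean shifts at t = 100, mimicking a task change.
    stream = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(2.0, 1.0, 50)])
    print("alarm at t =", sequential_detection(stream))
```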



Updated: 2022-03-13