Online Covariance Matrix Estimation in Stochastic Gradient Descent
Journal of the American Statistical Association (IF 3.0), Pub Date: 2021-07-21, DOI: 10.1080/01621459.2021.1933498
Wanrong Zhu, Xi Chen, Wei Biao Wu

Abstract

The stochastic gradient descent (SGD) algorithm is widely used for parameter estimation, especially for huge datasets and online learning. While this recursive algorithm is popular for its computational and memory efficiency, the variability and randomness of its solutions have rarely been quantified. This article aims to conduct statistical inference for SGD-based estimates in an online setting. In particular, we propose a fully online estimator for the covariance matrix of the averaged SGD (ASGD) iterates, using only the iterates themselves. We formally establish our online estimator’s consistency and show that its convergence rate is comparable to that of offline counterparts. Based on the classic asymptotic normality results for ASGD, we construct asymptotically valid confidence intervals for the model parameters. Upon receiving new observations, we can quickly update the covariance matrix estimate and the confidence intervals. This approach fits the online setting and retains the key advantage of SGD: efficiency in both computation and memory.
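To make the recursion concrete, here is a minimal Python sketch, not the authors' estimator: averaged SGD on a simulated linear model, with a simple batch-means estimate of the long-run covariance that is updated online as each batch closes. The linear model, step-size schedule, fixed batch size, and centering of batch means at the running average are all illustrative simplifications; the paper's fully online estimator instead updates its batch structure recursively and comes with formal consistency guarantees.

```python
import numpy as np

# Illustrative sketch only (not the paper's estimator): ASGD on
# least-squares with a fixed-batch-size batch-means covariance estimate.
rng = np.random.default_rng(0)
d = 3
theta_star = np.array([1.0, -2.0, 0.5])  # hypothetical true parameter

def stream(n):
    """Yield (x_i, y_i) from a linear model with unit Gaussian noise."""
    for _ in range(n):
        x = rng.normal(size=d)
        y = x @ theta_star + rng.normal()
        yield x, y

n = 50_000
theta = np.zeros(d)          # current SGD iterate
theta_bar = np.zeros(d)      # running ASGD (Polyak-Ruppert) average
batch_sum = np.zeros(d)      # sum of iterates in the current batch
S = np.zeros((d, d))         # accumulated outer products of batch means
batch_size, in_batch, n_batches = 200, 0, 0

for i, (x, y) in enumerate(stream(n), start=1):
    eta = 0.5 * i ** -0.6                 # Robbins-Monro step size
    grad = (x @ theta - y) * x            # least-squares gradient
    theta -= eta * grad
    theta_bar += (theta - theta_bar) / i  # online average of iterates

    batch_sum += theta
    in_batch += 1
    if in_batch == batch_size:            # close the batch online
        bm = batch_sum / batch_size       # batch mean of iterates
        # Simplification: center at the running average, not the final one.
        S += np.outer(bm - theta_bar, bm - theta_bar)
        batch_sum[:] = 0.0
        in_batch = 0
        n_batches += 1

# Batch-means estimate of the long-run covariance; 95% CI per coordinate.
Sigma_hat = batch_size * S / max(n_batches - 1, 1)
se = np.sqrt(np.diag(Sigma_hat) / n)
ci = np.stack([theta_bar - 1.96 * se, theta_bar + 1.96 * se], axis=1)
print("ASGD estimate:", theta_bar)
print("95% CIs:\n", ci)
```

With a fixed batch size this is only a rough proxy; consistency in the paper's sense requires a carefully chosen, growing batch scheme, which is what makes the authors' fully online construction nontrivial.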

Updated: 2021-07-21