Adaptive restart of accelerated gradient methods under local quadratic growth condition
IMA Journal of Numerical Analysis (IF 2.3), Pub Date: 2019-03-05, DOI: 10.1093/imanum/drz007
Olivier Fercoq, Zheng Qu

By analyzing accelerated proximal gradient methods under a local quadratic growth condition, we show that restarting these algorithms at any frequency yields a globally linearly convergent algorithm. This result was previously known only for sufficiently long restart periods. Since the rate of convergence depends on how well the restart frequency matches the quadratic error bound, we design a scheme that automatically adapts the restart frequency based on the observed decrease of the norm of the gradient mapping. Our algorithm has a better theoretical bound than previously proposed methods for adapting to the quadratic error bound of the objective. We illustrate the efficiency of the algorithm on Lasso, regularized logistic regression, and total variation denoising problems.
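The idea of adaptively restarting an accelerated proximal gradient method can be sketched as follows. This is a simplified illustration, not the paper's exact scheme: it runs FISTA on a Lasso problem, restarts the momentum every `K` iterations, and doubles `K` whenever a restart period fails to shrink the norm of the gradient mapping by a target factor (here a factor of `e`, a hypothetical choice). All names and the adaptation rule are assumptions for illustration.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista_adaptive_restart(A, b, lam, iters=500, K0=10):
    """FISTA for the Lasso: min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    Momentum is restarted every K iterations; K is doubled whenever a
    restart period does not reduce the gradient-mapping norm enough.
    Sketch under stated assumptions, not the authors' exact algorithm."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    K = K0
    k_since_restart = 0

    def grad_map_norm(x):
        # Norm of the gradient mapping L * (x - prox(x - grad/L))
        g = A.T @ (A @ x - b)
        return np.linalg.norm(L * (x - soft_threshold(x - g / L, lam / L)))

    last_gm = grad_map_norm(x)
    for _ in range(iters):
        g = A.T @ (A @ y - b)
        x_new = soft_threshold(y - g / L, lam / L)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
        k_since_restart += 1
        if k_since_restart >= K:
            gm = grad_map_norm(x)
            # If the period did not shrink the gradient-mapping norm by
            # the target factor, the restarts are too frequent: double K.
            if gm > last_gm / np.e:
                K *= 2
            last_gm = gm
            y, t, k_since_restart = x.copy(), 1.0, 0  # restart momentum
    return x
```

A usage sketch: on a small random Lasso instance, the routine recovers a sparse signal while the restart period settles to a value matched to the (unknown) local quadratic growth of the objective.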

Updated: 2020-04-17