New nonasymptotic convergence rates of stochastic proximal point algorithm for stochastic convex optimization
Optimization ( IF 2.2 ) Pub Date : 2020-05-24 , DOI: 10.1080/02331934.2020.1761364
Andrei Pătraşcu 1

Large sectors of the recent optimization literature have focused on developing optimal stochastic first-order schemes for constrained convex models under progressively relaxed assumptions. The stochastic proximal point (SPP) algorithm is an iterative scheme born from adapting the proximal point algorithm to noisy stochastic optimization; the resulting iteration is closely related to stochastic alternating projections. Inspired by the scalability of alternating projection methods, we start from the (linear) regularity assumption, typically used in convex feasibility problems to guarantee the linear convergence of stochastic alternating projection methods, and analyze a general weak linear regularity condition which facilitates convergence rate boosts in stochastic proximal point schemes. Our applications cover many non-strongly convex function classes often used in machine learning and statistics. Moreover, under the weak linear regularity assumption we guarantee an O(1/k) convergence rate for SPP, in terms of the distance to the optimal set, using only projections onto a simple component set. Linear convergence is obtained in the interpolation setting, when the optimal set of the expected cost is included in the optimal set of each functional component.
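As a hedged illustration of the iteration the abstract describes (not the paper's exact algorithm or stepsize schedule), the sketch below runs the generic SPP update x_{k+1} = prox_{γ f_{i_k}}(x_k) on a toy finite-sum least-squares problem in the interpolation setting: the linear system is consistent, so the minimizer of the expected cost also minimizes every component, which is the regime where linear convergence is claimed. The problem data, constant stepsize, and iteration count are all assumptions made for the example.

```python
import numpy as np

# Sketch of the stochastic proximal point (SPP) iteration
#   x_{k+1} = prox_{gamma * f_{i_k}}(x_k),  i_k sampled uniformly at random.
# Each component is a least-squares term f_i(x) = 0.5 * (a_i @ x - b_i)**2,
# whose proximal map has the closed form
#   prox_{g f_i}(x) = x - g * (a_i @ x - b_i) / (1 + g * ||a_i||^2) * a_i.

rng = np.random.default_rng(0)
m, n = 50, 10
A = rng.standard_normal((m, n))
x_star = rng.standard_normal(n)
b = A @ x_star  # consistent system: the "interpolation" setting

def spp(x0, steps=5000, gamma=1.0):
    x = x0.copy()
    for _ in range(steps):
        i = rng.integers(m)                 # sample one component f_i
        r = A[i] @ x - b[i]                 # residual of component i
        # closed-form proximal step on the sampled component
        x = x - gamma * r / (1.0 + gamma * (A[i] @ A[i])) * A[i]
    return x

x = spp(np.zeros(n))
print("distance to optimum:", np.linalg.norm(x - x_star))
```

In this consistent-system regime the update is a damped randomized-Kaczmarz step, and the distance to the (unique) optimum decays linearly in expectation; with an inconsistent system only the slower sublinear rate would be available.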



