Explore Aggressively, Update Conservatively: Stochastic Extragradient Methods with Variable Stepsize Scaling
arXiv - CS - Computer Science and Game Theory. Pub Date: 2020-03-23, DOI: arXiv-2003.10162
Yu-Guan Hsieh; Franck Iutzeler; Jérôme Malick; Panayotis Mertikopoulos

Owing to their stability and convergence speed, extragradient methods have become a staple for solving large-scale saddle-point problems in machine learning. The basic premise of these algorithms is the use of an extrapolation step before performing an update; thanks to this exploration step, extragradient methods overcome many of the non-convergence issues that plague gradient descent/ascent schemes. On the other hand, as we show in this paper, running vanilla extragradient with stochastic gradients may jeopardize its convergence, even in simple bilinear models. To overcome this failure, we investigate a double stepsize extragradient algorithm where the exploration step evolves at a more aggressive time-scale compared to the update step. We show that this modification allows the method to converge even with stochastic gradients, and we derive sharp convergence rates under an error bound condition.
Updated: 2020-03-24
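The abstract describes a two time-scale variant of stochastic extragradient: the exploration (extrapolation) step uses a more aggressive stepsize than the update step. Below is a minimal illustrative sketch in Python/NumPy on a toy bilinear saddle-point problem. The toy matrix A, the noise model in noisy_operator, and the specific stepsize schedules (k**-0.4 for exploration, k**-0.9 for the update) are all assumptions chosen for illustration, not the paper's prescribed parameters; the only property taken from the abstract is that the update stepsize decays faster than the exploration stepsize.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bilinear toy problem: min_x max_y  x^T A y,
# whose unique saddle point is (x, y) = (0, 0).
d = 5
A = rng.standard_normal((d, d))

def noisy_operator(x, y, sigma=0.1):
    """Stochastic estimate of the saddle-point operator
    V(x, y) = (grad_x, -grad_y) = (A y, -A^T x), with Gaussian noise."""
    gx = A @ y + sigma * rng.standard_normal(d)
    gy = -A.T @ x + sigma * rng.standard_normal(d)
    return gx, gy

x = rng.standard_normal(d)
y = rng.standard_normal(d)

for k in range(1, 10001):
    # Two time-scales: the exploration stepsize gamma_k decays more
    # slowly than the update stepsize eta_k, so eta_k / gamma_k -> 0.
    # The exponents below are illustrative choices, not the paper's.
    gamma_k = 1.0 / k ** 0.4   # aggressive exploration
    eta_k = 1.0 / k ** 0.9     # conservative update

    # Exploration (extrapolation) step with a fresh stochastic sample.
    gx, gy = noisy_operator(x, y)
    x_half = x - gamma_k * gx
    y_half = y - gamma_k * gy

    # Update step: move from the *base* point, using the operator
    # evaluated at the extrapolated point and the smaller stepsize.
    gx, gy = noisy_operator(x_half, y_half)
    x = x - eta_k * gx
    y = y - eta_k * gy

print("distance to saddle point:",
      np.hypot(np.linalg.norm(x), np.linalg.norm(y)))
```

The design point this sketch tries to convey is the ratio eta_k / gamma_k -> 0: the extrapolation probes far ahead while the iterate itself moves conservatively, which is the mechanism the abstract credits with restoring convergence under gradient noise.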

 
