On the convergence of single-call stochastic extra-gradient methods
arXiv - CS - Computer Science and Game Theory. Pub Date: 2019-08-22. arXiv:1908.08465
Yu-Guan Hsieh; Franck Iutzeler; Jérôme Malick; Panayotis Mertikopoulos

Variational inequalities have recently attracted considerable interest in machine learning as a flexible paradigm for models that go beyond ordinary loss function minimization (such as generative adversarial networks and related deep learning systems). In this setting, the optimal $\mathcal{O}(1/t)$ convergence rate for solving smooth monotone variational inequalities is achieved by the Extra-Gradient (EG) algorithm and its variants. To alleviate the cost of an extra gradient step per iteration (which can become quite substantial in deep learning applications), several algorithms have been proposed as surrogates to Extra-Gradient with a \emph{single} oracle call per iteration. In this paper, we develop a synthetic view of such algorithms, and we complement the existing literature by showing that they retain an $\mathcal{O}(1/t)$ ergodic convergence rate in smooth, deterministic problems. Subsequently, beyond the monotone deterministic case, we also show that the last iterate of single-call, \emph{stochastic} extra-gradient methods still enjoys an $\mathcal{O}(1/t)$ local convergence rate to solutions of \emph{non-monotone} variational inequalities that satisfy a second-order sufficient condition.
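For context, a sketch of the update rules involved (an editorial illustration, not part of the abstract; the notation $V$ for the operator, $\gamma_t$ for the step-size, and $X_{t+1/2}$ for the leading state is assumed here). In the unconstrained case, Extra-Gradient and one representative single-call surrogate, Past Extra-Gradient (Popov's method), iterate
\[ \text{(EG)} \qquad X_{t+1/2} = X_t - \gamma_t V(X_t), \qquad X_{t+1} = X_t - \gamma_t V(X_{t+1/2}), \]
\[ \text{(PEG)} \qquad X_{t+1/2} = X_t - \gamma_t V(X_{t-1/2}), \qquad X_{t+1} = X_t - \gamma_t V(X_{t+1/2}). \]
PEG reuses the oracle output $V(X_{t-1/2})$ from the previous iteration to form the leading state, so each iteration makes a single oracle call where EG makes two; constrained variants project each update onto the feasible set.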
Updated: 2020-02-12

 
