Linear convergence of cyclic SAGA
Optimization Letters (IF 1.6) Pub Date: 2020-01-04, DOI: 10.1007/s11590-019-01520-y
Youngsuk Park, Ernest K. Ryu

In this work, we present and analyze C-SAGA, a (deterministic) cyclic variant of SAGA. C-SAGA is an incremental gradient method that minimizes a sum of differentiable convex functions by cyclically accessing their gradients. Even though the theory of stochastic algorithms is generally more mature than that of their cyclic counterparts, practitioners often prefer cyclic algorithms. We prove that C-SAGA converges linearly under standard assumptions. We then compare its rate of convergence with those of the full gradient method, (stochastic) SAGA, and the incremental aggregated gradient (IAG) method, both theoretically and experimentally.
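To make the cyclic update concrete, below is a minimal NumPy sketch of a SAGA-style iteration with a deterministic cyclic index, applied to a least-squares instance f_i(x) = (1/2)(a_i^T x - b_i)^2. The update x ← x - γ(∇f_i(x) - g_i + (1/n)Σ_j g_j), with g_i the stored gradient of component i, is the standard SAGA step; the function name c_saga, the step size gamma, and the synthetic data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def c_saga(A, b, gamma, n_epochs):
    """Sketch of a cyclic SAGA-style method for 0.5*||A x - b||^2 / components."""
    n, d = A.shape
    x = np.zeros(d)
    table = np.zeros((n, d))        # stored gradient g_i for each component f_i
    avg = table.mean(axis=0)        # running average (1/n) * sum_j g_j
    for _ in range(n_epochs):
        for i in range(n):          # deterministic cyclic order, vs. random i in SAGA
            g_new = A[i] * (A[i] @ x - b[i])          # grad f_i(x) = a_i (a_i^T x - b_i)
            x = x - gamma * (g_new - table[i] + avg)  # SAGA-style corrected step
            avg += (g_new - table[i]) / n             # keep the average in sync
            table[i] = g_new
    return x

# Illustrative usage on random data (step size chosen ad hoc, not from the paper).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
x_hat = c_saga(A, b, gamma=0.01, n_epochs=200)
print(np.linalg.norm(A.T @ (A @ x_hat - b)))  # full gradient norm; small if converged
```

In this sketch the only difference from (stochastic) SAGA is the index schedule: the inner loop visits i = 0, …, n-1 in a fixed order each pass instead of drawing i uniformly at random.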

Updated: 2020-01-04