Second-Order Guarantees of Distributed Gradient Algorithms
SIAM Journal on Optimization (IF 3.1) Pub Date: 2020-10-22, DOI: 10.1137/18m121784x
Amir Daneshmand, Gesualdo Scutari, Vyacheslav Kungurtsev

SIAM Journal on Optimization, Volume 30, Issue 4, Page 3029-3068, January 2020.
We consider distributed smooth nonconvex unconstrained optimization over networks, modeled as a connected graph. We examine the behavior of distributed gradient-based algorithms near strict saddle points. Specifically, we establish that (i) the renowned distributed gradient descent algorithm likely converges to a neighborhood of a second-order stationary (SoS) solution; and (ii) the more recent class of distributed algorithms based on gradient tracking (implementable also over digraphs) likely converges to exact SoS solutions, thus avoiding (strict) saddle points. Furthermore, new convergence rate results for first-order critical points are established for the latter class of algorithms.
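For reference, the two algorithm classes contrasted in the abstract admit standard per-agent update rules; the sketch below uses common notation from the distributed-optimization literature, not necessarily the paper's own. Each agent $i$ keeps a local copy $x_i$ of the shared decision variable, $f_i$ is its local smooth (possibly nonconvex) cost, $\alpha > 0$ is a step size, and $w_{ij}$ are entries of a doubly stochastic mixing matrix matched to the (undirected) graph; the digraph-capable variants replace these with row-/column-stochastic weights.

\[
x_i^{k+1} = \sum_{j} w_{ij}\, x_j^{k} - \alpha\, \nabla f_i\bigl(x_i^{k}\bigr) \qquad \text{(distributed gradient descent)}
\]
\[
x_i^{k+1} = \sum_{j} w_{ij}\, x_j^{k} - \alpha\, y_i^{k}, \qquad
y_i^{k+1} = \sum_{j} w_{ij}\, y_j^{k} + \nabla f_i\bigl(x_i^{k+1}\bigr) - \nabla f_i\bigl(x_i^{k}\bigr) \qquad \text{(gradient tracking)}
\]

The auxiliary variable $y_i$ tracks the network-average gradient, which is the usual explanation for why gradient-tracking methods can converge to exact stationary solutions, whereas plain distributed gradient descent with a fixed step size only reaches a neighborhood.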


Update date: 2020-11-13