Acceleration of stochastic methods on the example of decentralized SGD
arXiv - CS - Computational Complexity Pub Date : 2020-11-15 , DOI: arxiv-2011.07585
Trimbach Ekaterina, Rogozin Alexander

In this paper, we present an algorithm for accelerating decentralized stochastic gradient descent. Decentralized stochastic optimization methods have recently attracted considerable attention, mainly due to their low per-iteration cost, data locality, and communication efficiency; they generalize algorithms such as SGD and Local SGD. A further important contribution of this work is an extension of the acceleration analysis for stochastic methods, which makes it possible to achieve acceleration in the decentralized setting.
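To illustrate the setting the abstract describes (not the paper's accelerated method itself), here is a hypothetical minimal sketch of plain decentralized SGD: each node holds a local objective, takes a local stochastic gradient step, and averages its iterate with its neighbors through a doubly stochastic mixing matrix. The ring topology, quadratic objectives, and all parameter values are illustrative assumptions.

```python
import numpy as np

def decentralized_sgd(b, steps=200, lr=0.1, noise=0.0, seed=0):
    """Sketch of decentralized SGD on local quadratics f_i(x) = 0.5*(x - b[i])^2.

    Each of the n nodes keeps a scalar iterate; nodes communicate over a ring.
    """
    rng = np.random.default_rng(seed)
    n = len(b)
    # Doubly stochastic mixing matrix for a ring: self-weight 1/2, neighbors 1/4.
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 0.5
        W[i, (i - 1) % n] = 0.25
        W[i, (i + 1) % n] = 0.25
    x = np.zeros(n)  # one local iterate per node
    for _ in range(steps):
        # Stochastic local gradients (optional additive noise models sampling).
        grads = (x - b) + noise * rng.standard_normal(n)
        # Local gradient step followed by gossip averaging with neighbors.
        x = W @ (x - lr * grads)
    return x

b = np.array([1.0, 2.0, 3.0, 4.0])
x = decentralized_sgd(b)
# With a constant step size, each node ends up in an O(lr) neighborhood of the
# global minimizer mean(b) = 2.5, while the network average matches it closely.
```

This sketch converges only at the unaccelerated rate; the paper's contribution is precisely an acceleration scheme (in the Nesterov sense) on top of this kind of iteration.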

Updated: 2020-11-17