Acceleration of Cooperative Least Mean Square via Chebyshev Periodical Successive Over-Relaxation
arXiv - CS - Information Theory Pub Date : 2020-11-24 , DOI: arxiv-2011.11927
Tadashi Wadayama, Satoshi Takabe

A distributed algorithm for least mean square (LMS) can be used in distributed signal estimation and in distributed training of multivariate regression models. The convergence speed of such an algorithm is a critical factor because a faster algorithm incurs less communication overhead and therefore requires less network bandwidth. The goal of this paper is to show that Chebyshev periodical successive over-relaxation (PSOR) can accelerate distributed LMS algorithms in a natural manner. The basic idea of Chebyshev PSOR is to introduce index-dependent PSOR factors that control the spectral radius of the matrix governing the convergence behavior of the modified fixed-point iteration. Acceleration of convergence is empirically confirmed over a wide range of networks, including well-known small graphs (e.g., the Karate graph) and random graphs such as Erdos-Renyi (ER) and Barabasi-Albert random graphs.
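The idea of index-dependent over-relaxation factors can be illustrated on a simple (non-distributed) linear fixed-point iteration. The sketch below applies periodic step factors derived from the roots of a Chebyshev polynomial to a Richardson-type iteration for a symmetric positive definite system; it is only a minimal illustration of the general Chebyshev-step technique under assumed eigenvalue bounds, not the authors' distributed LMS implementation, and the matrix, period `T`, and iteration counts are arbitrary choices for demonstration.

```python
import numpy as np

# Build a small symmetric positive definite test problem A x = b.
rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T / n + np.eye(n)
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)

# Eigenvalue bounds of A (assumed known here for simplicity).
lam = np.linalg.eigvalsh(A)
l, L = lam[0], lam[-1]

# Periodic Chebyshev factors: inverses of the Chebyshev polynomial
# roots shifted to the spectral interval [l, L], with period T.
T = 8
k = np.arange(T)
w = 2.0 / ((L + l) + (L - l) * np.cos((2 * k + 1) * np.pi / (2 * T)))

def run(steps, n_iter=32):
    """Richardson iteration x <- x + w_t (b - A x) with cyclic factors."""
    x = np.zeros(n)
    for t in range(n_iter):
        x = x + steps[t % len(steps)] * (b - A @ x)
    return np.linalg.norm(x - x_star)

err_const = run(np.array([2.0 / (L + l)]))  # best constant factor
err_cheb = run(w)                           # periodic Chebyshev factors
print(err_const, err_cheb)
```

Within one period the error may transiently grow, but after each complete period of `T` steps the accumulated contraction follows the Chebyshev polynomial, which is much smaller on `[l, L]` than the product of any constant factors, so `err_cheb` comes out well below `err_const`.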

Updated: 2020-11-25