Decentralized Federated Learning: Balancing Communication and Computing Costs
IEEE Transactions on Signal and Information Processing over Networks (IF 3.2) Pub Date: 2022-02-14, DOI: 10.1109/tsipn.2022.3151242
Wei Liu, Li Chen, Wenyi Zhang

Decentralized stochastic gradient descent (SGD) is a driving engine for decentralized federated learning (DFL). The performance of decentralized SGD is jointly influenced by inter-node communications and local updates. In this paper, we propose a general DFL framework, which periodically performs both multiple local updates and multiple inter-node communications, to strike a balance between communication efficiency and model consensus; it also serves as a general analytical framework for decentralized SGD. We establish strong convergence guarantees for the proposed DFL algorithm without assuming convex objectives. The convergence rate of DFL can be optimized to balance communication and computing costs under constrained resources. To further improve the communication efficiency of DFL, compressed communication is introduced into the framework, yielding a new scheme named DFL with compressed communication (C-DFL). The proposed C-DFL exhibits linear convergence for strongly convex objectives. Experimental results on the MNIST and CIFAR-10 datasets illustrate the superiority of DFL over traditional decentralized SGD methods and show that C-DFL further enhances communication efficiency.
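
As a rough illustration of the periodic scheme the abstract describes, the sketch below alternates a block of local SGD steps with a block of gossip (mixing) rounds over a mixing matrix W, and optionally passes the exchanged updates through a compressor to mimic C-DFL's compressed communication. The function names, the top-k compressor, and all hyperparameters (tau, q, lr, periods) are illustrative assumptions, not the authors' exact algorithm.

```python
# Minimal sketch of periodic decentralized SGD with multiple local updates and
# multiple inter-node communications per period (assumed interpretation of DFL),
# plus an optional compressor standing in for C-DFL's compressed communication.
import numpy as np

def gossip_round(params, W):
    """One inter-node communication: each node averages neighbors' models via
    the doubly stochastic mixing matrix W. params has shape (n_nodes, dim)."""
    return W @ params

def topk_compress(vec, k):
    """Illustrative top-k sparsification; the paper's actual compressor may differ."""
    k = min(k, vec.size)
    out = np.zeros_like(vec)
    idx = np.argpartition(np.abs(vec), -k)[-k:]
    out[idx] = vec[idx]
    return out

def dfl(grad_fn, x0, W, tau=5, q=2, lr=0.05, periods=100, compress=None, k=10):
    """Each period = tau local SGD steps followed by q communication rounds.
    grad_fn(i, x) returns a stochastic gradient of node i's local objective at x."""
    n = W.shape[0]
    params = np.tile(x0, (n, 1)).astype(float)
    for _ in range(periods):
        # multiple local updates (local computation)
        for _ in range(tau):
            for i in range(n):
                params[i] -= lr * grad_fn(i, params[i])
        # multiple inter-node communications (consensus)
        for _ in range(q):
            if compress is None:
                params = gossip_round(params, W)
            else:
                # exchange compressed correction terms instead of full models
                avg = gossip_round(params, W)
                params = params + np.array(
                    [compress(avg[i] - params[i], k) for i in range(n)]
                )
    return params.mean(axis=0)  # consensus estimate of the model
```

With compress=None this reduces to plain periodic gossip averaging; raising tau relative to q trades extra local computation for fewer communication rounds per period, which is the communication/computing balance the paper analyzes.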

Updated: 2022-02-14