Federated learning with stochastic quantization
International Journal of Intelligent Systems (IF 5.0), Pub Date: 2022-09-02, DOI: 10.1002/int.23056
Yawen Li, Wenling Li, Zhe Xue

This paper studies the distributed federated learning problem when the information exchanged between the server and the workers is quantized. A novel quantized federated averaging algorithm is developed by applying a stochastic quantization scheme to the local and global model parameters. Specifically, the server broadcasts the quantized global model parameter to the workers; the workers update their local model parameters using their own data sets and upload the quantized versions to the server; the server then updates the global model parameter by aggregating all the quantized local model parameters together with its previous global model parameter. The algorithm can be interpreted as a quantized variant of the federated averaging algorithm. Its convergence is analyzed theoretically for both convex and strongly convex loss functions with Lipschitz gradients. Extensive experiments on realistic data demonstrate the effectiveness of the proposed algorithm.
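A minimal sketch of one such communication round is given below, assuming the standard unbiased (QSGD-style) stochastic quantizer, since the abstract does not spell out the exact quantization operator. The least-squares local objective, the number of quantization levels `s`, the mixing weight `alpha`, and all function names (`stochastic_quantize`, `local_update`, `fed_avg_round`) are illustrative assumptions, not the paper's implementation.

```python
# Sketch of one round of quantized federated averaging.
# The quantizer is the standard unbiased stochastic (QSGD-style) scheme;
# the exact operator, hyperparameters, and local model are assumptions.
import numpy as np

def stochastic_quantize(x, s=16):
    """Unbiased stochastic quantization of a vector x onto s levels.

    Each |x_i| * s / ||x|| is rounded up or down at random with a
    probability that makes E[Q(x)] = x (unbiasedness).
    """
    norm = np.linalg.norm(x)
    if norm == 0.0:
        return x.copy()
    level = np.abs(x) * s / norm          # real-valued level in [0, s]
    lower = np.floor(level)
    prob = level - lower                  # P(round up) preserves the mean
    rounded = lower + (np.random.rand(*x.shape) < prob)
    return norm * np.sign(x) * rounded / s

def local_update(theta, data, lr=0.1, local_steps=5):
    """Worker-side SGD on a toy least-squares loss (placeholder model)."""
    A, b = data
    for _ in range(local_steps):
        grad = A.T @ (A @ theta - b) / len(b)
        theta = theta - lr * grad
    return theta

def fed_avg_round(theta_global, worker_data, alpha=0.5, s=16):
    """One round: broadcast Q(theta), local training, upload Q(theta_k),
    then mix the aggregate with the previous global parameter
    (the mixing weight `alpha` is an assumed hyperparameter)."""
    q_global = stochastic_quantize(theta_global, s)        # server -> workers
    q_locals = [stochastic_quantize(local_update(q_global.copy(), d), s)
                for d in worker_data]                      # workers -> server
    return (1 - alpha) * theta_global + alpha * np.mean(q_locals, axis=0)

# Tiny usage example with synthetic least-squares workers.
rng = np.random.default_rng(0)
dim, n = 5, 40
theta_true = rng.normal(size=dim)
workers = []
for _ in range(4):
    A = rng.normal(size=(n, dim))
    workers.append((A, A @ theta_true + 0.01 * rng.normal(size=n)))

theta = np.zeros(dim)
for t in range(50):
    theta = fed_avg_round(theta, workers)
print("distance to optimum:", np.linalg.norm(theta - theta_true))
```

The key property of the quantizer here is unbiasedness, E[Q(x)] = x, which is what convergence analyses for (strongly) convex losses with Lipschitz gradients typically rely on to absorb the quantization noise.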

Updated: 2022-09-02