Over-the-Air Federated Learning From Heterogeneous Data
IEEE Transactions on Signal Processing ( IF 4.6 ) Pub Date : 2021-06-17 , DOI: 10.1109/tsp.2021.3090323
Tomer Sery , Nir Shlezinger , Kobi Cohen , Yonina Eldar

We focus on over-the-air (OTA) Federated Learning (FL), which has recently been suggested as a way to reduce the communication overhead of FL incurred by the repeated transmission of model updates by a large number of users over the wireless channel. In OTA FL, all users simultaneously transmit their updates as analog signals over a multiple access channel, and the server receives a superposition of the transmitted analog signals. However, this approach results in the channel noise directly affecting the optimization procedure, which may degrade the accuracy of the trained model. We develop a Convergent OTA FL (COTAF) algorithm which enhances the common local stochastic gradient descent (SGD) FL algorithm, introducing precoding at the users and scaling at the server, which gradually mitigates the effect of noise. We analyze the convergence of COTAF to the loss-minimizing model and quantify the effect of a statistically heterogeneous setup, i.e., when the training data of each user obeys a different distribution. Our analysis reveals the ability of COTAF to achieve a convergence rate similar to that achievable over error-free channels. Our simulations demonstrate the improved convergence of COTAF over vanilla OTA local SGD when training on non-synthetic datasets. Furthermore, we numerically show that the precoding induced by COTAF notably improves the convergence rate and the accuracy of models trained via OTA FL.
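The core mechanism described above — users simultaneously transmitting precoded analog updates over a multiple access channel, the server receiving their noisy superposition and rescaling it — can be illustrated with a minimal NumPy sketch. This is not the paper's actual COTAF precoder; the fixed `precode_gain` and all function names here are illustrative assumptions, and the point is only that a larger precoding gain shrinks the effective channel noise at the server.

```python
import numpy as np

rng = np.random.default_rng(0)

def ota_aggregate(updates, precode_gain, noise_std):
    """One illustrative over-the-air aggregation round.

    Each user precodes (scales) its model update by sqrt(precode_gain)
    before analog transmission. The multiple access channel delivers the
    superposition of all transmitted signals plus additive Gaussian
    noise, and the server undoes the scaling to estimate the average
    update across users.
    """
    # Channel output: sum of precoded signals + additive noise.
    superposition = np.sqrt(precode_gain) * np.sum(updates, axis=0)
    received = superposition + rng.normal(0.0, noise_std, size=updates[0].shape)
    # Server-side scaling recovers a noisy estimate of the mean update.
    return received / (np.sqrt(precode_gain) * len(updates))

# Toy setup: 10 users, each holding a 4-dimensional "model update".
updates = [rng.normal(size=4) for _ in range(10)]
true_mean = np.mean(updates, axis=0)

# The effective noise at the server has standard deviation
# noise_std / (sqrt(precode_gain) * num_users), so a larger gain
# yields an estimate closer to the noiseless average.
low_gain = ota_aggregate(updates, precode_gain=1.0, noise_std=1.0)
high_gain = ota_aggregate(updates, precode_gain=100.0, noise_std=1.0)
```

In the paper's setting the precoding is time-varying rather than a fixed constant: intuitively, as the local SGD updates shrink over training rounds, the transmit power budget allows the updates to be amplified more aggressively, which is what gradually mitigates the noise in the optimization procedure.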
