Adaptive Clustered Federated Learning for Heterogeneous Data in Edge Computing
Mobile Networks and Applications (IF 3.8). Pub Date: 2022-05-05. DOI: 10.1007/s11036-022-01978-8
Biyao Gong¹, Tianzhang Xing¹, Junfeng Wang¹, Xiuya Liu¹, Zhidan Liu²

Although federated learning has been widely used in collaborative training of machine learning models, its practical use is still challenged by heterogeneous data across clients. To alleviate the impact of the non-IID data issue, we present an adaptive clustered federated learning approach, \(\mathtt {AdaCFL}\), which classifies clients into suitable clusters according to their local data distributions and trains a specialized model for the clients of each cluster. By exploiting the implicit connection between local model weights and the data distribution on clients, \(\mathtt {AdaCFL}\) relies on a selected subset of model weights to measure the data similarity between clients and adaptively groups them into the optimal number of clusters. Experimental results on three benchmark datasets with various non-IID data settings demonstrate that \(\mathtt {AdaCFL}\) achieves model accuracy comparable to state-of-the-art works, yet with a significant reduction in communication cost.
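
The abstract outlines, but does not detail, how weight similarity drives the clustering. The following Python sketch illustrates the general idea only: it assumes cosine similarity over a single chosen layer's weights and a silhouette-based choice of cluster count. The layer name ("classifier.weight"), the client_models structure, and the silhouette criterion are illustrative assumptions, not \(\mathtt {AdaCFL}\)'s actual procedure.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.metrics import silhouette_score

def weight_signatures(client_models, layer_name="classifier.weight"):
    # Flatten one selected layer per client into a vector "signature".
    # Which weights to select is a design choice; the final classifier
    # layer is used here only for illustration.
    return np.stack([np.asarray(m[layer_name]).ravel() for m in client_models])

def adaptive_cluster(signatures, max_clusters=8):
    # Try several cluster counts and keep the one with the best silhouette
    # score -- an illustrative stand-in for an "adaptive" cluster-number choice.
    best_k, best_score, best_labels = 1, -1.0, np.zeros(len(signatures), dtype=int)
    for k in range(2, min(max_clusters, len(signatures) - 1) + 1):
        labels = AgglomerativeClustering(
            n_clusters=k, metric="cosine", linkage="average"
        ).fit_predict(signatures)
        score = silhouette_score(signatures, labels, metric="cosine")
        if score > best_score:
            best_k, best_score, best_labels = k, score, labels
    return best_k, best_labels  # chosen cluster count, cluster label per client
```

In such a scheme, clients assigned to the same cluster would jointly train one specialized model, and only the selected subset of weights is needed to measure similarity, which is consistent with the abstract's claim of reduced communication cost.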




Updated: 2022-05-06