Federated learning with class imbalance reduction
arXiv - CS - Distributed, Parallel, and Cluster Computing Pub Date : 2020-11-23 , DOI: arxiv-2011.11266
Miao Yang, Akitanoshou Wong, Hongbin Zhu, Haifeng Wang, Hua Qian

Federated learning (FL) is a promising technique that enables a large number of edge computing devices to collaboratively train a global learning model. Due to privacy concerns, the raw data on the devices is not available to the centralized server. Constrained by spectrum limitations and computation capacity, only a subset of devices can be engaged to train and transmit their trained models to the centralized server for aggregation. Since the local data distribution varies across devices, a class imbalance problem arises under unfavorable client selection, resulting in a slow convergence rate for the global model. In this paper, an estimation scheme is designed to reveal the class distribution without access to the raw data. Based on this scheme, a device selection algorithm targeting minimal class imbalance is proposed, which improves the convergence performance of the global model. Simulation results demonstrate the effectiveness of the proposed algorithm.
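The abstract does not give the details of the authors' selection algorithm, but the idea of choosing a subset of devices whose combined class distribution is as balanced as possible can be sketched as follows. This is an illustrative greedy heuristic, not the paper's method: it assumes each client's (estimated) per-class sample counts are available as a vector, and scores a candidate group by the KL divergence of its aggregate class distribution from uniform. The function names `imbalance` and `select_devices` are hypothetical.

```python
import numpy as np

def imbalance(counts):
    """KL divergence of the normalized class-count vector from the
    uniform distribution; 0 means perfectly balanced classes."""
    p = counts / counts.sum()
    u = 1.0 / len(p)
    nz = p > 0  # avoid log(0): sum only over classes that are present
    return float(np.sum(p[nz] * np.log(p[nz] / u)))

def select_devices(client_counts, k):
    """Greedily pick k clients whose combined (estimated) class counts
    are closest to uniform."""
    remaining = list(range(len(client_counts)))
    selected = []
    total = np.zeros_like(client_counts[0], dtype=float)
    for _ in range(k):
        best = min(remaining, key=lambda i: imbalance(total + client_counts[i]))
        selected.append(best)
        total += client_counts[best]
        remaining.remove(best)
    return selected, total

# Toy example: 6 clients, 3 classes, skewed local label distributions.
rng = np.random.default_rng(0)
counts = rng.integers(0, 50, size=(6, 3)).astype(float)
chosen, agg = select_devices(counts, k=3)
```

A key point from the abstract is that the server cannot inspect raw data, so in practice `client_counts` would come from the paper's estimation scheme (e.g., inferred from model updates) rather than being reported directly by the clients.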

Updated: 2020-11-25