FedScale: Benchmarking Model and System Performance of Federated Learning
arXiv - CS - Performance. Pub Date: 2021-05-24, DOI: arxiv-2105.11367
Fan Lai, Yinwei Dai, Xiangfeng Zhu, Mosharaf Chowdhury

We present FedScale, a diverse set of challenging and realistic benchmark datasets to facilitate scalable, comprehensive, and reproducible federated learning (FL) research. FedScale datasets are large-scale, encompassing a diverse range of important FL tasks, such as image classification, object detection, language modeling, speech recognition, and reinforcement learning. For each dataset, we provide a unified evaluation protocol using realistic data splits and evaluation metrics. To meet the pressing need for reproducing realistic FL at scale, we have also built an efficient evaluation platform to simplify and standardize the process of FL experimental setup and model evaluation. Our evaluation platform provides flexible APIs to implement new FL algorithms and to add new execution backends with minimal developer effort. Finally, we perform in-depth benchmark experiments on these datasets. Our experiments suggest that FedScale presents significant challenges in heterogeneity-aware co-optimization of system and statistical efficiency under realistic FL characteristics, indicating fruitful opportunities for future research. FedScale is open-source with permissive licenses and actively maintained, and we welcome feedback and contributions from the community.
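
To illustrate the kind of FL algorithm such a platform lets developers plug in, the sketch below shows a generic FedAvg-style weighted aggregation step in Python. This is not FedScale's actual API; the function name federated_average and its arguments are illustrative, and the example only assumes NumPy.

import numpy as np

def federated_average(client_updates, client_weights):
    """Weighted average of client model updates (FedAvg-style aggregation).

    client_updates: list of 1-D numpy arrays, one flattened parameter
    vector per participating client.
    client_weights: list of floats, e.g. the number of local training
    samples each client holds.
    """
    total = float(sum(client_weights))
    aggregate = np.zeros_like(client_updates[0])
    for update, weight in zip(client_updates, client_weights):
        # Clients with more local data contribute proportionally more.
        aggregate += (weight / total) * update
    return aggregate

# Example: three clients with unequal amounts of local data.
updates = [np.array([1.0, 2.0]), np.array([2.0, 0.0]), np.array([0.0, 4.0])]
samples = [10, 30, 60]
print(federated_average(updates, samples))  # weighted toward larger clients

In a realistic deployment, the per-client data sizes (and thus the weights) are highly skewed, which is one source of the statistical heterogeneity the benchmark experiments highlight.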

Updated: 2021-05-25