Federated Generalized Bayesian Learning via Distributed Stein Variational Gradient Descent
arXiv - CS - Information Theory. Pub Date: 2020-09-11, DOI: arxiv-2009.06419
Rahif Kassab and Osvaldo Simeone

This paper introduces Distributed Stein Variational Gradient Descent (DSVGD), a non-parametric generalized Bayesian inference framework for federated learning. DSVGD maintains a number of non-random and interacting particles at a central server to represent the current iterate of the model global posterior. The particles are iteratively downloaded and updated by one of the agents with the end goal of minimizing the global free energy. By varying the number of particles, DSVGD enables a flexible trade-off between per-iteration communication load and number of communication rounds. DSVGD is shown to compare favorably to benchmark frequentist and Bayesian federated learning strategies, also scheduling a single device per iteration, in terms of accuracy and scalability with respect to the number of agents, while also providing well-calibrated, and hence trustworthy, predictions.
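To make the particle mechanism concrete, below is a minimal sketch of the standard (single-node) SVGD update that DSVGD builds on: a set of interacting particles is pushed along a kernelized Stein direction toward the target posterior. The Gaussian target, RBF kernel bandwidth, and step size are illustrative assumptions, and the federated scheduling of agents around a shared particle set is omitted; this is not the paper's exact procedure.

```python
# Minimal SVGD sketch (toy, single node). Target, kernel bandwidth and
# step size are illustrative; DSVGD's distributed scheduling is omitted.
import numpy as np

def rbf_kernel(x, h=1.0):
    """RBF kernel matrix and its gradient w.r.t. the first argument."""
    diff = x[:, None, :] - x[None, :, :]          # (n, n, d) pairwise differences
    sq_dist = np.sum(diff ** 2, axis=-1)          # (n, n)
    K = np.exp(-sq_dist / (2 * h ** 2))           # kernel matrix k(x_i, x_j)
    # grad_{x_j} k(x_j, x_i) = -(x_j - x_i) / h^2 * k(x_j, x_i)
    grad_K = -diff / (h ** 2) * K[:, :, None]     # (n, n, d)
    return K, grad_K

def svgd_step(particles, grad_log_p, step=0.05, h=1.0):
    """One SVGD update: move particles along the kernelized Stein direction."""
    n = particles.shape[0]
    K, grad_K = rbf_kernel(particles, h)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    drive = K.T @ grad_log_p                      # attraction toward high-density regions
    repulse = grad_K.sum(axis=0)                  # repulsion keeps particles spread out
    return particles + step * (drive + repulse) / n

# Toy example: approximate a standard Gaussian posterior with 20 particles.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(20, 1))  # initial particles, far from target
for _ in range(500):
    grad_log_p = -x                               # grad log N(0, 1) at each particle
    x = svgd_step(x, grad_log_p)
print("particle mean ~ 0:", x.mean(), "particle std ~ 1:", x.std())
```

In DSVGD the analogous particles live at the central server and are downloaded, updated by a scheduled agent against its local free-energy term, and uploaded back, so the particle count directly sets the per-iteration communication load.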

Updated: 2020-11-20