Decentralized Personalized Federated Min-Max Problems
arXiv - CS - Distributed, Parallel, and Cluster Computing Pub Date : 2021-06-14 , DOI: arxiv-2106.07289
Aleksandr Beznosikov, Vadim Sushko, Abdurakhmon Sadiev, Alexander Gasnikov

Personalized Federated Learning (PFL) has recently seen tremendous progress, allowing the design of novel machine learning applications that preserve the privacy of the training data. Existing theoretical results in this field mainly focus on distributed optimization for minimization problems. This paper is the first to study PFL for saddle point problems, which cover a broader class of optimization tasks and are therefore more relevant for applications than minimization alone. In this work, we consider a recently proposed PFL setting with a mixing objective function, an approach that combines the learning of a global model with local distributed learners. Unlike most previous papers, which considered only the centralized setting, we work in a more general, decentralized setup. This allows us to design and analyze more practical and federated ways of connecting devices to the network. We present two new algorithms for this problem and give a theoretical analysis of the methods for smooth (strongly-)convex-(strongly-)concave saddle point problems. We also demonstrate the effectiveness of our problem formulation and the proposed algorithms in experiments with neural networks under adversarial noise.
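The mixing-objective setting described above can be sketched as follows (the notation here is illustrative and may differ from the paper's exact formulation): each of the M devices holds a local saddle point objective, and quadratic penalty terms couple the local variables to their network averages, trading off personalization against consensus.

```latex
% Sketch of a mixing-objective personalized federated min-max problem.
% f_m is the local saddle point objective on device m; the penalty
% weight \lambda controls how strongly local models are pulled toward
% the averages \bar{x}, \bar{y} (larger \lambda = less personalization).
\min_{x_1,\dots,x_M}\;\max_{y_1,\dots,y_M}\;
\frac{1}{M}\sum_{m=1}^{M} f_m(x_m, y_m)
+ \frac{\lambda}{2M}\sum_{m=1}^{M}\|x_m - \bar{x}\|^2
- \frac{\lambda}{2M}\sum_{m=1}^{M}\|y_m - \bar{y}\|^2,
\qquad
\bar{x} = \frac{1}{M}\sum_{m=1}^{M} x_m,\quad
\bar{y} = \frac{1}{M}\sum_{m=1}^{M} y_m .
```

With \lambda = 0 each device trains independently; as \lambda \to \infty the problem reduces to learning a single global saddle point model.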

Last updated: 2021-06-15