Decentralized Stochastic Optimization and Machine Learning: A Unified Variance-Reduction Framework for Robust Performance and Fast Convergence
IEEE Signal Processing Magazine (IF 14.9). Pub Date: 2020-05-01. DOI: 10.1109/msp.2020.2974267
Ran Xin, Soummya Kar, Usman A. Khan

Decentralized methods to solve finite-sum minimization problems are important in many signal processing and machine learning tasks where the data samples are distributed across a network of nodes, and raw data sharing is not permitted due to privacy and/or resource constraints. In this article, we review decentralized stochastic first-order methods and provide a unified algorithmic framework that combines variance reduction with gradient tracking to achieve robust performance and fast convergence. We provide explicit theoretical guarantees of the corresponding methods when the objective functions are smooth and strongly convex and show their applicability to nonconvex problems via numerical experiments. Throughout the article, we provide intuitive illustrations of the main technical ideas by casting appropriate tradeoffs and comparisons among the methods of interest and by highlighting applications to decentralized training of machine learning models.
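The abstract itself contains no pseudocode. As an illustration of the kind of scheme the framework unifies, the following is a minimal NumPy sketch that combines gradient tracking with a SAGA-style variance-reduced gradient estimator (in the spirit of GT-SAGA, one representative method in this family) on a decentralized least-squares problem. The ring topology, mixing weights, step size, and problem dimensions are all illustrative assumptions, not values from the article.

```python
import numpy as np

# Sketch: decentralized finite-sum minimization via gradient tracking
# plus SAGA-style variance reduction. Each of n nodes holds m local
# least-squares samples; nodes jointly minimize the network-wide
# average loss while exchanging only iterates with ring neighbors.

rng = np.random.default_rng(0)
n, m, d = 8, 20, 5              # nodes, samples per node, dimension (assumed)
A = rng.normal(size=(n, m, d))  # local feature vectors
x_true = rng.normal(size=d)
B = A @ x_true + 0.01 * rng.normal(size=(n, m))

# Doubly stochastic mixing matrix for a ring (lazy Metropolis weights).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

def grad(i, j, x):
    """Gradient of the j-th local least-squares sample at node i."""
    a, b = A[i, j], B[i, j]
    return (a @ x - b) * a

alpha = 0.02                    # step size (assumed, not tuned)
X = np.zeros((n, d))            # local iterates, one row per node
table = np.stack([[grad(i, j, X[i]) for j in range(m)] for i in range(n)])
avg = table.mean(axis=1)        # running average of stored gradients
V = avg.copy()                  # current variance-reduced gradient estimates
Y = V.copy()                    # gradient trackers, initialized at V

for k in range(3000):
    X = W @ X - alpha * Y       # consensus step + descent along the tracker
    V_new = np.empty_like(V)
    for i in range(n):
        j = rng.integers(m)     # sample one local data point
        g_new = grad(i, j, X[i])
        # SAGA estimator: unbiased, with variance vanishing as the
        # gradient table converges to the gradients at the optimum.
        V_new[i] = g_new - table[i, j] + avg[i]
        avg[i] += (g_new - table[i, j]) / m
        table[i, j] = g_new
    Y = W @ Y + V_new - V       # track the network-average gradient
    V = V_new

print("consensus error:", np.linalg.norm(X - X.mean(0)))
print("distance to x* :", np.linalg.norm(X.mean(0) - x_true))
```

The two ingredients mirror the abstract's framework: the tracker Y lets each node follow the network-average gradient despite only local communication, while the SAGA table removes the stochastic-gradient variance, which is what yields fast (linear, in the smooth strongly convex case) convergence.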

Updated: 2020-05-01