Linearized ADMM Converges to Second-Order Stationary Points for Non-Convex Problems
IEEE Transactions on Signal Processing ( IF 4.6 ) Pub Date : 2021-08-02 , DOI: 10.1109/tsp.2021.3100976
Songtao Lu , Jason Lee , Meisam Razaviyayn , Mingyi Hong

In this work, a gradient-based primal-dual method of multipliers is proposed for solving a class of linearly constrained non-convex problems. We show that with random initialization of the primal and dual variables, the algorithm is able to compute second-order stationary points (SOSPs) with probability one. Further, we present applications of the proposed method in popular signal processing and machine learning problems such as decentralized matrix factorization and decentralized training of overparameterized neural networks. One of the key steps in the analysis is to construct a new loss function for these problems such that the required convergence conditions (especially the gradient Lipschitz conditions) can be satisfied without changing the global optimal points.
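To make the setting concrete, the iteration described above can be sketched as a simple gradient-based primal-dual loop. This is a minimal illustrative sketch, not the authors' exact algorithm: the toy non-convex objective `f`, the step sizes `alpha` and `beta`, the penalty `rho`, and the problem dimensions are all assumed values chosen for the example, applied to a generic linearly constrained problem min f(x) subject to Ax = b with randomly initialized primal and dual variables.

```python
# Sketch of a gradient-based primal-dual (linearized-ADMM-style) iteration for
#     min_x f(x)   s.t.  A x = b,   f smooth but non-convex,
# using the augmented Lagrangian
#     L(x, y) = f(x) + y^T (A x - b) + (rho / 2) ||A x - b||^2.
# Each iteration takes one gradient step in x (linearizing f) and one small
# dual ascent step in y. All constants below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n, m = 6, 2
A = rng.standard_normal((m, n))   # linear constraint matrix (toy data)
b = rng.standard_normal(m)

def grad_f(x):
    # Gradient of the non-convex toy objective f(x) = sum((x_i^2 - 1)^2) / 4.
    return x * (x**2 - 1.0)

rho, alpha, beta = 2.0, 0.005, 0.1   # penalty, primal step, dual step (assumed)
x = rng.standard_normal(n)           # random primal initialization
y = rng.standard_normal(m)           # random dual initialization

for _ in range(50000):
    r = A @ x - b                                           # constraint residual
    x = x - alpha * (grad_f(x) + A.T @ y + rho * (A.T @ r)) # primal gradient step
    y = y + beta * (A @ x - b)                              # dual ascent step

print("feasibility   :", np.linalg.norm(A @ x - b))
print("stationarity  :", np.linalg.norm(grad_f(x) + A.T @ (y + rho * (A @ x - b))))
```

At a limit point the residual vanishes and the gradient of the Lagrangian is zero, i.e. a first-order KKT point; the paper's contribution is showing that, from random initialization, such dynamics avoid strict saddles and reach second-order stationary points with probability one.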

Updated: 2021-08-02