An incremental aggregated proximal ADMM for linearly constrained nonconvex optimization with application to sparse logistic regression problems
Journal of Computational and Applied Mathematics (IF 2.1), Pub Date: 2021-01-12, DOI: 10.1016/j.cam.2021.113384
Zehui Jia , Jieru Huang , Zhongming Wu

We propose an incremental aggregated proximal alternating direction method of multipliers (IAPADMM) for solving a class of nonconvex optimization problems with linear constraints. The new method inherits the advantages of the classical alternating direction method of multipliers and the incremental aggregated proximal method, both of which have been well studied for structured optimization problems. Under some mild conditions, we prove that any limit point of the sequence generated by IAPADMM is a critical point of the considered problem. Furthermore, when the objective function satisfies the Kurdyka–Łojasiewicz property, we establish global convergence of the proposed method. Finally, numerical results are reported to illustrate the effectiveness and advantages of the new method.
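For context on the sparse logistic regression application, the problem can be written in the linearly constrained splitting form that ADMM-type methods target: minimize the logistic loss plus an L1 penalty on an auxiliary variable, subject to w - z = 0. The sketch below is a plain consensus ADMM for this formulation, not the paper's incremental aggregated proximal variant; the synthetic data, penalty parameter rho, and regularization weight lam are assumptions made only for illustration.

# Generic ADMM sketch for sparse logistic regression:
#   minimize  sum_i log(1 + exp(-y_i * x_i^T w)) + lam * ||z||_1   s.t.  w - z = 0
# Illustration only; NOT the paper's IAPADMM.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)
n, d, lam, rho = 200, 50, 0.1, 1.0            # assumed problem sizes and parameters
X = rng.standard_normal((n, d))
w_true = np.zeros(d); w_true[:5] = rng.standard_normal(5)
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(n))

def logistic_loss(w):
    # sum_i log(1 + exp(-y_i * x_i^T w)), computed stably
    return np.sum(np.logaddexp(0.0, -y * (X @ w)))

def logistic_grad(w):
    # gradient of the logistic loss
    s = -y * expit(-y * (X @ w))
    return X.T @ s

w = np.zeros(d); z = np.zeros(d); u = np.zeros(d)   # u is the scaled dual variable
for k in range(50):
    # w-update: smooth subproblem, solved approximately with L-BFGS
    obj = lambda w_: logistic_loss(w_) + 0.5 * rho * np.sum((w_ - z + u) ** 2)
    grad = lambda w_: logistic_grad(w_) + rho * (w_ - z + u)
    w = minimize(obj, w, jac=grad, method="L-BFGS-B").x
    # z-update: soft-thresholding (proximal operator of the L1 norm)
    v = w + u
    z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
    # dual update
    u = u + w - z

print("nonzero coefficients:", int(np.sum(np.abs(z) > 1e-6)))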



Updated: 2021-01-24