An alternating direction method of multipliers with the BFGS update for structured convex quadratic optimization
Computational and Applied Mathematics (IF 2.5), Pub Date: 2021-03-11, DOI: 10.1007/s40314-021-01467-w
Yan Gu , Nobuo Yamashita

The alternating direction method of multipliers (ADMM) is an effective method for solving convex problems arising in a wide range of fields. At each iteration, the classical ADMM solves two subproblems exactly. In many applications, however, obtaining exact solutions of the subproblems is expensive or impossible. To overcome this difficulty, proximal terms are added to the subproblems. Methods of this class solve the original subproblems only approximately and hence typically require more iterations. This observation motivates the search for a proximal term that yields better performance than the classical ADMM. In this paper, we propose a proximal ADMM whose regularization matrix in the proximal term is generated by the BFGS update (or limited-memory BFGS) at every iteration. Matrices of this type exploit second-order information of the objective function. The convergence of the proposed method is proved under certain assumptions. Numerical results are presented to demonstrate the effectiveness of the proposed proximal ADMM.
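The abstract does not specify the paper's algorithm in detail, so the following is only a minimal numerical sketch of the general idea: a proximal ADMM on a toy consensus quadratic program, where the proximal matrix H of the x-subproblem is refreshed each iteration by the standard BFGS formula applied to the curvature of f. The problem data (P, Q, q), the penalty rho, and the constraint x = z are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Illustrative structured convex QP (assumed, not from the paper):
#   minimize 0.5*x'Px + q'x + 0.5*z'Qz   subject to   x = z
M = rng.standard_normal((n, n))
P = M @ M.T + n * np.eye(n)   # symmetric positive definite
Q = np.eye(n)
q = rng.standard_normal(n)

rho = 1.0                      # ADMM penalty parameter (assumed)
x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)   # u: scaled dual variable
H = np.eye(n)                  # BFGS matrix used in the proximal term

def grad_f(v):                 # gradient of f(x) = 0.5*x'Px + q'x
    return P @ v + q

for k in range(300):
    # x-update with proximal term 0.5*||x - x_k||_H^2:
    #   minimize f(x) + (rho/2)*||x - z + u||^2 + 0.5*(x - x_k)' H (x - x_k)
    x_old = x.copy()
    x = np.linalg.solve(P + rho * np.eye(n) + H,
                        -q + rho * (z - u) + H @ x_old)
    # z-update (solved exactly): minimize g(z) + (rho/2)*||x - z + u||^2
    z = np.linalg.solve(Q + rho * np.eye(n), rho * (x + u))
    u = u + x - z              # dual update (scaled form)
    # BFGS update of H from the step s and the gradient change y
    s = x - x_old
    y = grad_f(x) - grad_f(x_old)
    if s @ y > 1e-12:          # curvature condition keeps H positive definite
        Hs = H @ s
        H = H - np.outer(Hs, Hs) / (s @ Hs) + np.outer(y, y) / (s @ y)

# For this consensus problem the optimum solves (P + Q) x = -q
x_star = np.linalg.solve(P + Q, -q)
print("error:", np.linalg.norm(x - x_star))
```

Since f here is quadratic, y = P s exactly, so the curvature condition always holds and H tends toward the Hessian of f; the proximal term then injects second-order information into the x-subproblem, which is the effect the paper attributes to the BFGS-generated regularization matrix.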




Updated: 2021-03-12