Towards an efficient augmented Lagrangian method for convex quadratic programming
Computational Optimization and Applications (IF 1.6). Pub Date: 2019-12-16. DOI: 10.1007/s10589-019-00161-2
Luís Felipe Bueno, Gabriel Haeser, Luiz-Rafael Santos

Interior point methods have attracted most of the attention in recent decades for solving large-scale convex quadratic programming problems. In this paper we take a different route: we present an augmented Lagrangian method for convex quadratic programming based on recent developments for nonlinear programming. In our approach, box constraints are penalized while equality constraints are kept within the subproblems. The motivation for this approach is that Newton’s method can be efficient for minimizing a piecewise quadratic function. Moreover, since augmented Lagrangian methods do not rely on proximity to the central path, some of the inherent difficulties in interior point methods can be avoided. In addition, a good starting point can be easily exploited, which can be relevant for solving subproblems arising from sequential quadratic programming, in sensitivity analysis and in branch-and-bound techniques. We prove well-definedness and finite convergence of the proposed method. Numerical experiments on separable strictly convex quadratic problems formulated from the Netlib collection show that our method can be competitive with interior point methods, in particular when a good initial point is available and a second-order Lagrange multiplier update is used.
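To make the structure described in the abstract concrete, below is a minimal NumPy sketch of an augmented Lagrangian iteration for a convex quadratic program in which the box constraints are penalized and the equality constraints are kept inside each subproblem, with the piecewise quadratic subproblem minimized by a simple semismooth-Newton loop on the equality-constrained KKT system. The function name `al_qp`, the tolerances, the tenfold penalty increase, and the plain first-order multiplier update are illustrative assumptions; the paper's actual algorithm, including its second-order multiplier update and finite-termination safeguards, is not reproduced here.

```python
import numpy as np


def al_qp(Q, c, A, b, l, u, rho=10.0, outer_iters=50, tol=1e-8):
    """Sketch of an augmented Lagrangian method for
       min 0.5*x'Qx + c'x  s.t.  A x = b,  l <= x <= u,
    with the bounds penalized and the equalities kept in the subproblems."""
    n, m = Q.shape[0], A.shape[0]
    x = np.zeros(n)
    y = np.zeros(m)          # multipliers of A x = b (updated by the KKT solves)
    mu = np.zeros(n)         # multipliers of l - x <= 0
    lam = np.zeros(n)        # multipliers of x - u <= 0
    for _ in range(outer_iters):
        # Inner solve: minimize the (piecewise quadratic) augmented Lagrangian
        # subject to A x = b, via a semismooth Newton loop on the KKT system.
        for _ in range(100):
            s_lo = np.maximum(0.0, mu + rho * (l - x))   # rho * max(0, mu/rho + l - x)
            s_up = np.maximum(0.0, lam + rho * (x - u))
            grad = Q @ x + c - s_lo + s_up
            D = np.diag(((s_lo > 0) | (s_up > 0)).astype(float))
            H = Q + rho * D                              # generalized Hessian
            K = np.block([[H, A.T], [A, np.zeros((m, m))]])
            rhs = np.concatenate([-(grad + A.T @ y), b - A @ x])
            step = np.linalg.solve(K, rhs)
            x += step[:n]
            y += step[n:]
            if np.linalg.norm(step[:n]) <= tol:
                break
        # First-order multiplier update for the bounds, then penalty update.
        mu = np.maximum(0.0, mu + rho * (l - x))
        lam = np.maximum(0.0, lam + rho * (x - u))
        viol = max(np.max(np.maximum(0.0, l - x)), np.max(np.maximum(0.0, x - u)))
        if viol <= tol:
            break
        rho *= 10.0
    return x, y, mu, lam


if __name__ == "__main__":
    # Tiny separable example: min 0.5*||x||^2 + c'x  s.t.  sum(x) = 1,  0 <= x <= 0.6.
    Q = np.eye(3)
    c = np.array([-1.0, 0.0, 1.0])
    A = np.ones((1, 3))
    b = np.array([1.0])
    l, u = np.zeros(3), np.full(3, 0.6)
    x, *_ = al_qp(Q, c, A, b, l, u)
    print(np.round(x, 4))    # expected solution: approximately [0.6, 0.4, 0.0]
```

In this sketch the only piecewise behaviour comes from the max(0, ·) terms generated by the penalized bounds, so each inner Newton system differs from the previous one only through the diagonal activity matrix D, which is what makes a Newton-type inner solver attractive for this class of subproblems.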
