An Efficient Hessian Based Algorithm for Singly Linearly and Box Constrained Least Squares Regression
Journal of Scientific Computing ( IF 2.8 ) Pub Date : 2021-06-10 , DOI: 10.1007/s10915-021-01541-9
Lanyu Lin , Yong-Jin Liu

The singly linearly and box constrained least squares regression problem arises in diverse applications. This paper builds on previous work to develop an efficient and robust semismooth Newton based augmented Lagrangian (Ssnal) algorithm for solving this problem, in which a semismooth Newton (Ssn) algorithm with superlinear or even quadratic convergence is applied to solve the subproblems. Theoretically, the global convergence and the asymptotically superlinear local convergence of the Ssnal algorithm hold automatically under standard conditions. Computationally, a generalized Jacobian of the projector onto the feasible set is shown to be either diagonal or diagonal-minus-rank-one, which is a key ingredient in the efficiency of the Ssnal algorithm. Numerical experiments on both synthetic and real data sets demonstrate that the Ssnal algorithm is much more efficient and robust than several state-of-the-art first-order algorithms.
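The feasible set in this problem is the intersection of a single linear equality with a box, C = {x : aᵀx = b, l ≤ x ≤ u}. The projector onto C, whose generalized Jacobian the paper exploits, reduces to a one-dimensional root-finding problem in the multiplier of the linear constraint: x(λ) = clip(y − λa, l, u), with λ chosen so that aᵀx(λ) = b. The sketch below is an illustrative assumption, not the authors' implementation; the function name `project_slbc` and the bisection bracketing scheme are my own choices, and the set C is assumed nonempty.

```python
import numpy as np

def project_slbc(y, a, b, lo, hi, tol=1e-12, max_iter=200):
    """Project y onto {x : a @ x = b, lo <= x <= hi}.

    Uses bisection on the multiplier lam of the equality constraint:
    x(lam) = clip(y - lam*a, lo, hi), and phi(lam) = a @ x(lam) - b
    is nonincreasing in lam, so a sign change brackets the root.
    Assumes the feasible set is nonempty.
    """
    y, a, lo, hi = (np.asarray(v, dtype=float) for v in (y, a, lo, hi))
    x = lambda lam: np.clip(y - lam * a, lo, hi)
    phi = lambda lam: a @ x(lam) - b

    # Expand an initial bracket [lam_lo, lam_hi] with phi(lam_lo) >= 0 >= phi(lam_hi).
    lam_lo, lam_hi = -1.0, 1.0
    while phi(lam_lo) < 0:
        lam_lo *= 2.0
    while phi(lam_hi) > 0:
        lam_hi *= 2.0

    lam = 0.5 * (lam_lo + lam_hi)
    for _ in range(max_iter):
        v = phi(lam)
        if abs(v) < tol:
            break
        if v > 0:
            lam_lo = lam
        else:
            lam_hi = lam
        lam = 0.5 * (lam_lo + lam_hi)
    return x(lam)
```

For a fixed active set, x(λ) is piecewise linear in λ, which is why a generalized Jacobian of this projector is diagonal (the clipping pattern) corrected by a rank-one term from the dependence of λ on y, matching the diagonal-minus-rank-one structure the abstract highlights.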




Updated: 2021-06-11