New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
Numerical Algorithms (IF 1.7), Pub Date: 2020-10-28, DOI: 10.1007/s11075-020-01017-1
Ting Zhao , Hongwei Liu , Zexian Liu

In this paper, two new subspace minimization conjugate gradient methods based on p-regularization models are proposed, where a special scaled norm in the p-regularization model is analyzed. Different choices of the scaled norm lead to different solutions of the p-regularized subproblem. Based on analyses of the solutions in a two-dimensional subspace, we derive new directions satisfying the sufficient descent condition. With a modified nonmonotone line search, we establish the global convergence of the proposed methods under mild assumptions. The R-linear convergence of the proposed methods is also analyzed. Numerical results on the CUTEr library show that the proposed methods are superior to four conjugate gradient methods proposed by Hager and Zhang (SIAM J. Optim. 16(1):170–192, 2005), Dai and Kou (SIAM J. Optim. 23(1):296–320, 2013), Liu and Liu (J. Optim. Theory Appl. 180(3):879–906, 2019), and Li et al. (Comput. Appl. Math. 38(1), 2019), respectively.
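
To make the general framework concrete, the Python sketch below shows a generic subspace minimization conjugate gradient iteration: the search direction is obtained by minimizing a simple quadratic model over the two-dimensional subspace span{g_k, s_{k-1}}, combined with a Zhang–Hager style nonmonotone Armijo backtracking line search. The function name (smcg_sketch), the curvature estimates, and the use of a quadratic rather than p-regularized subspace model are illustrative assumptions of this sketch, not the method proposed in the paper.

```python
import numpy as np

def smcg_sketch(f, grad, x0, max_iter=1000, tol=1e-6, eta=0.85, c1=1e-4):
    """Generic subspace-minimization CG loop with a nonmonotone line search.

    The direction is d = u*g + v*s, where (u, v) minimizes a quadratic model
    of f restricted to span{g, s} (g: current gradient, s: previous step).
    A Zhang-Hager style nonmonotone Armijo backtracking accepts the step.
    NOTE: this uses a plain quadratic subspace model, not the paper's
    p-regularized subproblem; the curvature estimates are illustrative only.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    s = y = None
    C, Q = f(x), 1.0                      # nonmonotone reference value / weight
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        d = -g                            # default: steepest descent
        if s is not None and (sy := s @ y) > 1e-12:
            # Secant-based entries s'Bs ~ s'y, g'Bs ~ g'y; g'Bg is estimated
            # with a Barzilai-Borwein-type scaling (an assumption of this sketch).
            gg, gy = g @ g, g @ y
            M = np.array([[(y @ y) / sy * gg, gy],
                          [gy, sy]])
            rhs = -np.array([gg, g @ s])
            try:
                u, v = np.linalg.solve(M, rhs)
                cand = u * g + v * s
                # Keep the candidate only if it is a sufficient descent direction.
                if g @ cand <= -1e-10 * np.linalg.norm(g) * np.linalg.norm(cand):
                    d = cand
            except np.linalg.LinAlgError:
                pass
        # Nonmonotone Armijo backtracking against the reference value C.
        alpha, gd = 1.0, g @ d
        while f(x + alpha * d) > C + c1 * alpha * gd and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        x, g = x_new, g_new
        Q_next = eta * Q + 1.0
        C = (eta * Q * C + f(x)) / Q_next  # Zhang-Hager reference update
        Q = Q_next
    return x

# Usage example on the 2-D Rosenbrock function.
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
rosen_grad = lambda z: np.array([-2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
                                 200 * (z[1] - z[0]**2)])
print(smcg_sketch(rosen, rosen_grad, np.array([-1.2, 1.0])))
```

The fallback to the steepest-descent direction whenever the subspace candidate fails the descent test mirrors the role of the sufficient descent condition in the abstract: it guarantees the line search always has a descent direction to work with.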

Updated: 2020-10-30