Error bound conditions and convergence of optimization methods on smooth and proximally smooth manifolds
Optimization (IF 2.2), Pub Date: 2020-09-07, DOI: 10.1080/02331934.2020.1812066
M. V. Balashov, A. A. Tremba

We analyse the convergence of the gradient projection algorithm, finalized with the Newton method, to a stationary point of the nonconvex constrained optimization problem $\min_{x \in S} f(x)$ with a proximally smooth set $S = \{x \in \mathbb{R}^n : g(x) = 0\}$, $g : \mathbb{R}^n \to \mathbb{R}^m$, and a smooth function $f$. We propose new error bound (EB) conditions for the gradient projection method which lead to the convergence domain of the Newton method. We prove that these EB conditions are typical for a wide class of optimization problems. A high convergence rate of the algorithm can be reached by switching to the Newton method.
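The abstract describes a two-phase scheme: run gradient projection until the iterate enters the convergence domain of the Newton method (which the EB conditions certify), then switch to Newton for fast local convergence. Below is a minimal Python sketch of that pattern on a toy instance, minimizing $f(x) = \frac{1}{2}x^{\top}Ax$ over the unit sphere $S = \{x : \|x\|^2 = 1\}$. The sphere constraint, the quadratic objective, the fixed step size, and the residual-based switching rule are all illustrative assumptions, not the authors' construction.

```python
import numpy as np

def gradient_projection_then_newton(A, x0, switch_tol=1e-3, newton_tol=1e-12):
    """Minimize f(x) = 0.5 * x^T A x over S = {x : ||x||^2 = 1}.

    Phase 1: gradient projection x_{k+1} = P_S(x_k - t * grad f(x_k)),
    with the closed-form projection P_S(y) = y / ||y||.
    Phase 2: Newton's method on the stationarity (KKT) system
        F(x, lam) = [A x + 2 lam x ; x^T x - 1] = 0,
    entered once the projected step is small -- a crude stand-in for the
    EB-certified Newton convergence domain in the paper.
    """
    n = len(x0)
    t = 1.0 / (np.linalg.norm(A, 2) + 1.0)   # step below 1/||A|| keeps phase 1 stable
    x = x0 / np.linalg.norm(x0)

    # Phase 1: gradient projection.
    for _ in range(10_000):
        y = x - t * (A @ x)                  # gradient step, grad f(x) = A x
        x_next = y / np.linalg.norm(y)       # metric projection onto the sphere
        step_norm = np.linalg.norm(x_next - x)
        x = x_next
        if step_norm < switch_tol:           # assumed switching criterion
            break

    # Phase 2: Newton on F(x, lam) = 0; lam initialized from stationarity.
    lam = -0.5 * x @ (A @ x)
    for _ in range(50):
        F = np.concatenate([A @ x + 2.0 * lam * x, [x @ x - 1.0]])
        if np.linalg.norm(F) < newton_tol:
            break
        J = np.block([[A + 2.0 * lam * np.eye(n), (2.0 * x)[:, None]],
                      [(2.0 * x)[None, :],        np.zeros((1, 1))]])
        d = np.linalg.solve(J, -F)
        x, lam = x + d[:n], lam + d[n]
    return x, lam

# On this instance the stationary point reached is an eigenvector of A;
# the minimizer corresponds to the smallest eigenvalue.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2.0
x_star, lam_star = gradient_projection_then_newton(A, rng.standard_normal(5))
print(np.linalg.norm(A @ x_star + 2.0 * lam_star * x_star))  # ~0: stationarity
print(-2.0 * lam_star, np.linalg.eigvalsh(A)[0])             # both ~ smallest eigenvalue
```

The split mirrors the paper's motivation: gradient projection is globally stable but only linearly convergent, while Newton's method converges quadratically once started inside its convergence domain, so the payoff comes from knowing when it is safe to switch.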



