A Preconditioned Alternating Minimization Framework for Nonconvex and Half Quadratic Optimization
arXiv - CS - Numerical Analysis. Pub Date: 2021-07-29, DOI: arxiv-2107.13755
Shengxiang Deng, Ismail Ben Ayed, Hongpeng Sun

For some typical and widely used nonconvex half-quadratic regularization models and the Ambrosio-Tortorelli approximation of the Mumford-Shah model, we develop an efficient preconditioned framework for the linear subproblems that arise in the nonlinear alternating minimization procedure, based on the Kurdyka-Łojasiewicz analysis and recent nonconvex proximal algorithms. Solving large-scale linear subproblems is an important and challenging task for many alternating minimization algorithms. By incorporating efficient and classical preconditioned iterations into the nonlinear and nonconvex optimization, we prove that only one or any finite number of preconditioned iterations is needed for each linear subproblem, without the error control required by the usual inexact solvers. The proposed preconditioned framework provides great flexibility and efficiency for the linear subproblems while simultaneously guaranteeing the global convergence of the nonlinear alternating minimization method.
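
Below is a minimal Python sketch, not the authors' implementation, of the idea described in the abstract: alternate between a closed-form half-quadratic weight update and a linear subproblem, and replace the exact linear solve by a fixed, small number of preconditioned iterations. The 1-D denoising model, the rational weight function, the Jacobi preconditioner, and all parameter values (`lam`, `eps`, iteration counts) are illustrative assumptions, not the models or preconditioners analyzed in the paper.

```python
# Hypothetical sketch of preconditioned alternating minimization for a
# 1-D half-quadratic denoising model (illustrative only; the paper's
# models and preconditioners may differ).
import numpy as np

def half_quadratic_denoise(f, lam=10.0, eps=0.1, outer_iters=50, inner_iters=1):
    n = f.size
    # Forward-difference operator D of size (n-1) x n, dense for clarity.
    D = np.zeros((n - 1, n))
    D[np.arange(n - 1), np.arange(n - 1)] = -1.0
    D[np.arange(n - 1), np.arange(1, n)] = 1.0

    u = f.copy()
    for _ in range(outer_iters):
        # v-step: closed-form half-quadratic weight update (assumed
        # rational weight; edges get small weight and are preserved).
        g = D @ u
        v = 1.0 / (1.0 + (g / eps) ** 2)

        # u-step: linear subproblem (I + 2*lam * D^T diag(v) D) u = f.
        A = np.eye(n) + 2.0 * lam * D.T @ (v[:, None] * D)
        b = f
        # Instead of an exact or tolerance-controlled inexact solve, run
        # only a fixed, small number of Jacobi-preconditioned iterations,
        # mirroring the paper's claim that one or finitely many
        # preconditioned sweeps suffice.
        M_inv = 1.0 / np.diag(A)          # Jacobi preconditioner
        for _ in range(inner_iters):
            r = b - A @ u                  # current residual
            u = u + M_inv * r              # preconditioned update
    return u

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.concatenate([np.zeros(50), np.ones(50)])
    noisy = clean + 0.1 * rng.standard_normal(clean.size)
    restored = half_quadratic_denoise(noisy)
    print("noisy MSE:   ", np.mean((noisy - clean) ** 2))
    print("restored MSE:", np.mean((restored - clean) ** 2))
```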

Updated: 2021-07-30