A globally convergent algorithm for a class of gradient compounded non-Lipschitz models applied to non-additive noise removal
Inverse Problems (IF 2.1) Pub Date: 2020-12-09, DOI: 10.1088/1361-6420/abc793
Zhe Zheng, Michael Ng, Chunlin Wu

Non-Lipschitz regularization has received much attention recently in image restoration with additive noise removal, since it can preserve neat edges in the restored image. In this paper, we consider a class of minimization problems with gradient compounded non-Lipschitz regularization applied to non-additive noise removal, with Poisson and multiplicative noise as examples. The existence of a solution of the general model is discussed. We also extend the recent iterative support shrinkage strategy to give an algorithm for minimizing it, where the subproblem at each iteration is allowed to be solved inexactly. Moreover, this paper is the first to give the subdifferential of the gradient compounded non-Lipschitz regularization term, based on which we are able to establish the global convergence of the iterative sequence to a stationary point of the original objective function. This is, to the best of our knowledge, stronger than all convergence results for gradient compounded non-Lipschitz minimization problems in the published literature. Numerical experiments show that our proposed method performs well.
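As a rough illustration only (the abstract does not reproduce the paper's concrete formulation), a representative member of this model class in the restoration literature combines the standard Poisson data-fidelity term with an ℓ_p (0 < p < 1) penalty compounded with the discrete gradient; the exact fidelity, regularizer and parameters used by the authors may differ.

% Hedged sketch of a gradient compounded non-Lipschitz model for
% Poisson noise removal; the paper's actual formulation is not given
% in this abstract and may differ from this illustrative example.
\begin{equation*}
  \min_{u>0}\ \sum_{i}\bigl(u_i - f_i\log u_i\bigr)
  \;+\; \lambda \sum_{i} \bigl\lVert (\nabla u)_i \bigr\rVert_2^{\,p},
  \qquad 0<p<1,
\end{equation*}
% f: observed noisy image, \nabla: discrete gradient operator,
% \lambda>0: regularization parameter. The exponent p<1 makes the
% regularizer non-Lipschitz wherever (\nabla u)_i = 0, which is what
% favors sparse gradients and hence neat edges.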



Updated: 2020-12-09