Efficient and Convergent Preconditioned ADMM for the Potts Models
SIAM Journal on Scientific Computing (IF 3.1), Pub Date: 2021-03-29, DOI: 10.1137/20m1343956
Hongpeng Sun , Xue-Cheng Tai , Jing Yuan

SIAM Journal on Scientific Computing, Volume 43, Issue 2, Pages B455-B478, January 2021.
The Potts model is a key model across a broad spectrum of applications in image processing and computer vision; mathematically, it can be formulated as a min-cut problem and, from the primal-dual perspective, solved by maximizing flow. There is great interest in developing efficient methods with better algorithmic structure and proven convergence, which remains an open question for the classical augmented Lagrangian method (ALM)-based approaches. In this work, we propose two novel preconditioned and overrelaxed alternating direction methods of multipliers (ADMMs) with guaranteed convergence, based on the classical Eckstein--Bertsekas and Fortin--Glowinski splitting techniques. In particular, the two new algorithms are substantially accelerated by the proposed preconditioners and overrelaxation schemes. We apply the proposed preconditioned overrelaxed ADMM methods to image segmentation; experimental results demonstrate that the proposed methods significantly outperform the classical ALM-based algorithms in numerical efficiency while retaining proven convergence.
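For orientation, the min-cut/max-flow primal-dual pair the abstract refers to can be sketched as follows. This is the standard convex-relaxed (continuous) Potts model and its continuous max-flow dual, written in generic notation (labels \ell_1,\dots,\ell_n, data cost \rho, regularization weight \alpha); the exact formulation and notation used in the paper may differ.

% Convex-relaxed Potts model (continuous min-cut) with n labels on a domain \Omega,
% over the simplex-constrained labeling functions u = (u_1,\dots,u_n):
\min_{u \in \mathcal{S}} \; \sum_{i=1}^{n} \int_{\Omega} u_i(x)\,\rho(\ell_i,x)\,dx
  \;+\; \alpha \sum_{i=1}^{n} \int_{\Omega} |\nabla u_i(x)|\,dx,
\qquad
\mathcal{S} = \Bigl\{ u \;:\; \sum_{i=1}^{n} u_i(x) = 1,\ u_i(x) \ge 0 \Bigr\}.

% Its dual is a continuous max-flow problem with source flow p_s, sink flows p_i,
% and spatial flows q_i, coupled by a flow-conservation constraint:
\max_{p_s,\,p,\,q} \; \int_{\Omega} p_s(x)\,dx
\quad \text{s.t.} \quad
|q_i(x)| \le \alpha, \quad
p_i(x) \le \rho(\ell_i,x), \quad
\operatorname{div} q_i(x) - p_s(x) + p_i(x) = 0, \quad i = 1,\dots,n.

Roughly speaking, ALM/ADMM-type schemes of the kind studied in the paper augment the flow-conservation constraint with a quadratic penalty and alternate updates of the flow variables with a multiplier (labeling) update; the paper's contribution is to precondition these subproblem solves and overrelax the updates while retaining a convergence guarantee.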


Updated: 2021-03-30