Bregman Itoh–Abe Methods for Sparse Optimisation
Journal of Mathematical Imaging and Vision. Pub Date: 2020-02-03. DOI: 10.1007/s10851-020-00944-x
Martin Benning, Erlend Skaldehaug Riis, Carola-Bibiane Schönlieb

In this paper, we propose optimisation methods for variational regularisation problems based on discretising the inverse scale space flow with discrete gradient methods. Inverse scale space flow generalises gradient flows by incorporating a generalised Bregman distance as the underlying metric. Its discrete-time counterparts, Bregman iterations and linearised Bregman iterations, are popular regularisation schemes for inverse problems that incorporate a priori information without loss of contrast. Discrete gradient methods are tools from geometric numerical integration for preserving the energy dissipation of dissipative differential systems. The resulting Bregman discrete gradient methods are unconditionally dissipative and achieve rapid convergence rates by exploiting structures of the problem such as sparsity. Building on previous work on discrete gradients for non-smooth, non-convex optimisation, we prove convergence guarantees for these methods in a Clarke subdifferential framework. Numerical results for convex and non-convex examples are presented.
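To make the setting concrete, the following is a minimal NumPy sketch of the classical linearised Bregman iteration that the abstract refers to, applied to a small sparse-recovery problem. This is an illustration of the standard scheme (dual gradient ascent with soft thresholding), not the paper's Bregman discrete gradient method; the problem sizes, step-size choice, and parameter `mu` are illustrative assumptions.

```python
import numpy as np

def shrink(v, mu):
    """Soft-thresholding (shrinkage) operator: the proximal map of mu*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)

def linearised_bregman(A, b, mu, n_iter=10000):
    """Linearised Bregman iteration for
        min  mu*||x||_1 + 0.5*||x||_2^2   subject to   A x = b.

    The iteration is gradient ascent on the smooth dual problem; the step
    size tau must satisfy tau < 2 / ||A||_2^2 for convergence.
    """
    m, n = A.shape
    tau = 1.0 / np.linalg.norm(A, 2) ** 2   # safe dual step size
    v = np.zeros(n)                          # dual (subgradient) variable
    x = np.zeros(n)
    for _ in range(n_iter):
        x = shrink(v, mu)                    # primal update in closed form
        v = v + tau * A.T @ (b - A @ x)      # dual ascent step
    return x

# Small compressed-sensing style demo: recover a 3-sparse signal
# from 20 random Gaussian measurements in dimension 50.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[5, 17, 40]] = [1.0, -1.0, 1.0]
b = A @ x_true
x = linearised_bregman(A, b, mu=5.0)
```

Because the constraint `A x = b` is enforced at convergence, the residual `||A x - b||` tends to zero, while the Bregman structure of the iteration recovers the sparse signal without the loss of contrast that plain Tikhonov-style shrinkage would introduce.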
