Block-Based Refitting in \(\ell_{12}\) Sparse Regularization
Journal of Mathematical Imaging and Vision (IF 1.3), Pub Date: 2020-10-17, DOI: 10.1007/s10851-020-00993-2
Charles-Alban Deledalle, Nicolas Papadakis, Joseph Salmon, Samuel Vaiter

In many linear regression problems, including ill-posed inverse problems in image restoration, the data exhibit sparse structures that can be used to regularize the inversion. A classical approach is \(\ell_{12}\) block-based regularization. While efficient at retrieving the inherent sparsity pattern of the data (the support), the estimated solutions are known to suffer from a systematic bias. We propose a general framework for removing this artifact by refitting the solution toward the data while preserving key structural features such as the support. This is done through refitting block penalties that act only on the support of the estimated solution. Based on an analysis of related works in the literature, we introduce a new penalty that is well suited for refitting. We also present a new algorithm that produces the refitted solution alongside the original (biased) solution for any convex refitting block penalty. Experiments illustrate the good behavior of the proposed block penalty for refitting solutions of total variation and total generalized variation models.
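The abstract describes refitting \(\ell_{12}\)-regularized estimates on their recovered support. As a generic illustration only (not the authors' refitting block penalties or algorithm), the sketch below applies block soft-thresholding, the proximal operator of the \(\ell_{12}\) (group-lasso) penalty, and then a plain least-squares refit restricted to the nonzero blocks; the function names, block layout, and toy data are our own assumptions.

```python
import numpy as np

def block_soft_threshold(z, lam, blocks):
    """Prox of the l12 (group-lasso) penalty: each block of z is
    shrunk toward zero by lam in its l2 norm (zeroed if norm <= lam).
    This shrinkage is the source of the amplitude bias."""
    x = np.zeros_like(z)
    for b in blocks:
        nrm = np.linalg.norm(z[b])
        if nrm > lam:
            x[b] = (1.0 - lam / nrm) * z[b]
    return x

def refit_on_support(A, y, x_biased, blocks):
    """Naive refitting: least squares on the columns of A belonging
    to the nonzero blocks of the biased estimate. The support is
    preserved; the amplitude bias on it is removed."""
    sup = [i for b in blocks if np.linalg.norm(x_biased[b]) > 0 for i in b]
    x = np.zeros_like(x_biased)
    if sup:
        coef, _, _, _ = np.linalg.lstsq(A[:, sup], y, rcond=None)
        x[sup] = coef
    return x

# Toy denoising problem (A = identity): one active block, one inactive.
A = np.eye(4)
y = np.array([3.0, 4.0, 0.1, 0.2])
blocks = [[0, 1], [2, 3]]
x_biased = block_soft_threshold(y, 1.0, blocks)   # [2.4, 3.2, 0, 0]
x_refit = refit_on_support(A, y, x_biased, blocks)  # [3, 4, 0, 0]
```

In this toy run the first block (norm 5 > 1) is kept but shrunk by a factor 1 - 1/5, and the refit restores its unbiased amplitude while the second block stays zero, which is exactly the support-preserving debiasing behavior the paper refines with its block penalties.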


