ADMM-DIPTV: combining Total Variation and Deep Image Prior for image restoration
arXiv - CS - Numerical Analysis. Pub Date: 2020-09-23, DOI: arxiv-2009.11380. Pasquale Cascarano, Andrea Sebastiani, Maria Colomba Comes
In recent decades, unsupervised deep-learning-based methods have attracted researchers' attention, since in many applications collecting a large number of training examples is not feasible. Moreover, constructing a good training set is time-consuming and difficult, because the selected data must be sufficiently representative of the task. In this paper, we focus on the Deep Image Prior (DIP) framework augmented with a Total Variation regularizer, which promotes gradient sparsity in the solution. Unlike other existing approaches, we solve the resulting minimization problem using the well-known Alternating Direction Method of Multipliers (ADMM) framework, decoupling the contributions of the DIP $L_{2}$-norm and Total Variation terms. The promising performance of the proposed approach, in terms of PSNR and SSIM values, is demonstrated through experiments on different image restoration tasks, on synthetic as well as real data.
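The splitting the abstract refers to can be sketched as follows; the notation here is assumed for illustration and is not taken from the paper. Let $f_{\theta}$ be the DIP network with fixed random input $z$, $H$ the known degradation operator, $b$ the observed image, and $D$ the discrete gradient. Introducing an auxiliary variable $t$ constrained to equal $D f_{\theta}(z)$ decouples the data-fit and TV terms:

```latex
\min_{\theta,\, t}\; \tfrac{1}{2}\|H f_{\theta}(z) - b\|_{2}^{2}
  + \lambda \|t\|_{2,1}
  \quad \text{s.t.} \quad t = D f_{\theta}(z).
% Scaled-form ADMM iterations with penalty parameter \rho > 0:
\theta^{k+1} = \arg\min_{\theta}\; \tfrac{1}{2}\|H f_{\theta}(z) - b\|_{2}^{2}
  + \tfrac{\rho}{2}\|D f_{\theta}(z) - t^{k} + \mu^{k}\|_{2}^{2}
  \quad \text{(network weights, solved inexactly by gradient descent)}
t^{k+1} = \arg\min_{t}\; \lambda \|t\|_{2,1}
  + \tfrac{\rho}{2}\|D f_{\theta^{k+1}}(z) - t + \mu^{k}\|_{2}^{2}
  \quad \text{(closed-form isotropic shrinkage)}
\mu^{k+1} = \mu^{k} + D f_{\theta^{k+1}}(z) - t^{k+1}
  \quad \text{(dual update)}
```

Under this splitting, the $t$-subproblem is the standard TV proximal step, solved pixel-wise by soft-thresholding the gradient magnitudes; only the $\theta$-subproblem requires backpropagation through the network, which is what makes the decoupling attractive.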
Updated: 2020-09-25