Big in Japan: Regularizing Networks for Solving Inverse Problems.
Journal of Mathematical Imaging and Vision (IF 1.3). Pub Date: 2019-10-03. DOI: 10.1007/s10851-019-00911-1
Johannes Schwab, Stephan Antholzer, Markus Haltmeier

Deep learning and (deep) neural networks are emerging tools to address inverse problems and image reconstruction tasks. Despite outstanding performance, a mathematical analysis of neural networks for solving inverse problems is mostly missing. In this paper, we introduce and rigorously analyze families of deep regularizing neural networks (RegNets) of the form \(\mathbf{B}_\alpha + \mathbf{N}_{\theta(\alpha)} \mathbf{B}_\alpha\), where \(\mathbf{B}_\alpha\) is a classical regularization and the network \(\mathbf{N}_{\theta(\alpha)} \mathbf{B}_\alpha\) is trained to recover the missing part \({\text{Id}}_X - \mathbf{B}_\alpha\) not found by the classical regularization. We show that these regularizing networks yield a convergent regularization method for solving inverse problems. Additionally, we derive convergence rates (quantitative error estimates) assuming a sufficient decay of the associated distance function. We demonstrate that our results recover, as special cases, existing convergence and convergence rate results for filter-based regularization methods as well as for the recently introduced null space network. Numerical results for a tomographic sparse data problem clearly demonstrate that the proposed RegNets improve on classical regularization as well as on the null space network.
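The two-step structure described in the abstract (a classical regularized reconstruction followed by a learned correction) can be made concrete with a short sketch. The following is a minimal illustration, not the authors' implementation: it assumes a linear forward operator given as a dense matrix, uses Tikhonov regularization as the classical operator \(\mathbf{B}_\alpha\), and a small fully connected network as the trainable part; the names tikhonov_recon, ResidualNet, and regnet_reconstruct are illustrative only.

```python
# Minimal sketch of the RegNet structure B_alpha + N_theta(alpha) . B_alpha
# (illustrative assumptions: linear forward operator A as a dense matrix,
# Tikhonov regularization as B_alpha, a small MLP as N_theta).

import torch
import torch.nn as nn


def tikhonov_recon(y, A, alpha):
    """Classical regularization B_alpha: Tikhonov solution of A x = y."""
    # Solve (A^T A + alpha I) x = A^T y.
    n = A.shape[1]
    lhs = A.T @ A + alpha * torch.eye(n)
    return torch.linalg.solve(lhs, A.T @ y)


class ResidualNet(nn.Module):
    """Network N_theta applied to the regularized reconstruction B_alpha y;
    it is meant to approximate the part of the signal removed by B_alpha."""

    def __init__(self, n):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n, 2 * n), nn.ReLU(),
            nn.Linear(2 * n, n),
        )

    def forward(self, x_reg):
        return self.net(x_reg)


def regnet_reconstruct(y, A, alpha, net):
    """RegNet reconstruction: B_alpha y + N_theta(B_alpha y)."""
    x_reg = tikhonov_recon(y, A, alpha)
    return x_reg + net(x_reg)
```

In such a setup the network would be trained on pairs \((\mathbf{B}_\alpha \mathbf{A} x_i,\; x_i - \mathbf{B}_\alpha \mathbf{A} x_i)\), so that it learns the component of the signal not recovered by the classical regularization.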

Updated: 2019-10-03