Bilevel Optimization, Deep Learning and Fractional Laplacian Regularization with Applications in Tomography
Inverse Problems (IF 2.0) Pub Date: 2020-05-15, DOI: 10.1088/1361-6420/ab80d7
Harbir Antil, Zichao Wendy Di, Ratna Khatri

In this work we consider a generalized bilevel optimization framework for solving inverse problems. We introduce the fractional Laplacian as a regularizer to improve reconstruction quality and compare it with total variation regularization. We emphasize that the key advantage of using the fractional Laplacian as a regularizer is that it leads to a linear operator, in contrast to total variation regularization, which results in a nonlinear degenerate operator. Inspired by residual neural networks, we develop a dedicated bilevel optimization neural network with variable depth for a general regularized inverse problem, which learns the optimal strength of regularization and the exponent of the fractional Laplacian. We also draw some parallels between an activation function in a neural network and regularization, and we illustrate how to incorporate various regularizer choices into the proposed network. As an example, we consider tomographic reconstruction as a model problem and show an improvement in reconstruction quality, especially for limited data, via fractional Laplacian regularization. We successfully learn the regularization strength and the fractional exponent via the proposed bilevel optimization neural network, and we observe that fractional Laplacian regularization outperforms total variation regularization. This is especially encouraging, and important, in the case of limited and noisy data.
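A minimal sketch of the bilevel setup the abstract describes, with notation assumed here rather than taken from the paper (forward operator A, data f_i, training references u_i†, regularization strength α, fractional exponent s):

\[
\min_{\alpha > 0,\; s \in (0,1)} \ \sum_i \frac{1}{2}\,\bigl\| u_i(\alpha, s) - u_i^{\dagger} \bigr\|^2
\quad \text{subject to} \quad
u_i(\alpha, s) = \operatorname*{arg\,min}_{u}\ \frac{1}{2}\,\| A u - f_i \|^2 + \frac{\alpha}{2}\,\bigl\| (-\Delta)^{s/2} u \bigr\|^2 .
\]

Because \((-\Delta)^{s}\) is linear, the lower-level optimality condition is the linear system \(A^{*}(A u - f_i) + \alpha (-\Delta)^{s} u = 0\); this is the sense in which the fractional Laplacian "leads to a linear operator," in contrast to the nonlinear degenerate operator produced by total variation.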
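To make the linearity concrete, here is a hedged NumPy sketch of one lower-level solve. The function name is hypothetical, and A = I (denoising) on a periodic grid is a simplifying assumption for exposition; the paper's model problem is tomographic reconstruction, not denoising.

    import numpy as np

    def fractional_laplacian_denoise(f, alpha, s):
        # Hypothetical helper, not from the paper: solves the linear system
        # (I + alpha * (-Delta)^s) u = f on a periodic 2D grid via the FFT.
        n, m = f.shape
        kx = 2.0 * np.pi * np.fft.fftfreq(n)
        ky = 2.0 * np.pi * np.fft.fftfreq(m)
        lam = kx[:, None] ** 2 + ky[None, :] ** 2   # eigenvalues of -Delta (symbol |k|^2)
        symbol = 1.0 + alpha * lam ** s             # spectral symbol of I + alpha * (-Delta)^s
        return np.real(np.fft.ifft2(np.fft.fft2(f) / symbol))

    # Example: smooth noisy data; alpha and s are the quantities the paper's
    # bilevel network learns, fixed by hand here.
    rng = np.random.default_rng(0)
    f = rng.standard_normal((64, 64))
    u = fractional_laplacian_denoise(f, alpha=1.0, s=0.5)

Each lower-level solve reduces to a division by a spectral symbol, whereas a total variation solve would require an iterative method for a nonlinear degenerate problem.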

Updated: 2020-05-15