Convergence rates for iteratively regularized Gauss–Newton method subject to stability constraints
Journal of Computational and Applied Mathematics (IF 2.1), Pub Date: 2021-07-22, DOI: 10.1016/j.cam.2021.113744
Gaurav Mittal , Ankik Kumar Giri

In this paper, we formulate convergence rates for the iteratively regularized Gauss–Newton method by defining the iterates via convex optimization problems in a Banach space setting. We employ the concept of conditional stability, in place of the well-known concept of variational inequalities, to deduce the convergence rates. To validate our abstract theory, we also discuss an ill-posed inverse problem that satisfies our assumptions, and we compare our results with existing results in the literature.
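
For orientation, the sketch below shows the classical (Hilbert-space) iteratively regularized Gauss–Newton update, where each iterate solves a Tikhonov-regularized linearization of the forward problem. This is only the standard special case; the paper's variant defines the iterates via convex optimization problems in a Banach space setting, which this quadratic form does not capture. The forward map `F`, the starting point, and the parameter choices here are purely illustrative and are not the inverse problem discussed in the paper.

```python
# Minimal sketch of the classical iteratively regularized Gauss-Newton (IRGN)
# iteration in a Hilbert-space (here: R^n) setting.
# Update rule: x_{k+1} = x_0 + (A^T A + alpha_k I)^{-1} A^T (y - F(x_k) + A (x_k - x_0)),
# with A = F'(x_k) and a geometrically decaying regularization parameter alpha_k.
import numpy as np

def F(x):
    # Hypothetical nonlinear forward operator F: R^2 -> R^2 (illustrative only).
    return np.array([x[0] * x[1], x[0] ** 2 + x[1]])

def F_prime(x):
    # Jacobian (Frechet derivative) of F at x.
    return np.array([[x[1], x[0]],
                     [2.0 * x[0], 1.0]])

def irgn(y_delta, x0, n_iter=10, alpha0=1.0, q=0.5):
    # Classical IRGN with alpha_k = alpha0 * q**k.
    x = x0.copy()
    for k in range(n_iter):
        A = F_prime(x)
        alpha = alpha0 * q ** k
        rhs = A.T @ (y_delta - F(x) + A @ (x - x0))
        step = np.linalg.solve(A.T @ A + alpha * np.eye(len(x0)), rhs)
        x = x0 + step
    return x

if __name__ == "__main__":
    x_true = np.array([1.0, 2.0])
    y_delta = F(x_true) + 1e-3 * np.array([1.0, -1.0])  # noisy data
    x_rec = irgn(y_delta, x0=np.array([0.5, 1.5]))
    print("reconstruction:", x_rec)
```

In practice, the number of iterations is tied to the noise level via a stopping rule; the fixed iteration count above is a simplification for the sketch.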




Updated: 2021-08-01