Choose Your Path Wisely: Gradient Descent in a Bregman Distance Framework
SIAM Journal on Imaging Sciences (IF 2.1). Pub Date: 2021-06-22. DOI: 10.1137/20m1357500
Martin Benning, Marta M. Betcke, Matthias J. Ehrhardt, Carola-Bibiane Schönlieb

SIAM Journal on Imaging Sciences, Volume 14, Issue 2, Page 814-843, January 2021.
We propose an extension of a special form of gradient descent---known in the literature as linearized Bregman iteration---to a larger class of nonconvex functions. We replace the classical (squared) two-norm metric in the gradient descent setting with a generalized Bregman distance, based on a proper, convex, and lower semicontinuous function. The algorithm's global convergence is proven for functions that satisfy the Kurdyka--Łojasiewicz property. Examples illustrate that features of different scales are introduced throughout the iteration, transitioning from coarse to fine. This coarse-to-fine approach with respect to scale allows us to recover solutions of nonconvex optimization problems that are superior to those obtained with conventional gradient descent, or even projected and proximal gradient descent. The effectiveness of the linearized Bregman iteration in combination with early stopping is illustrated for the applications of parallel magnetic resonance imaging, blind deconvolution, as well as image classification with neural networks.
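As a rough illustration of the kind of scheme the abstract describes, the following is a minimal sketch of the classical linearized Bregman iteration for a smooth data-fidelity term E, using the elastic-net-type choice J(x) = λ‖x‖₁ + ½‖x‖² for the Bregman-distance-generating function (so the primal update reduces to soft-thresholding). All function names, the toy least-squares problem, and the parameter choices are assumptions for illustration, not the authors' implementation or experiments:

```python
import numpy as np

def shrink(v, lam):
    # Soft-thresholding: for J(x) = lam*||x||_1 + 0.5*||x||^2, the map
    # p -> argmax_x { <p, x> - J(x) } is componentwise sign(p)*max(|p|-lam, 0).
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def linearized_bregman(grad_E, x0, lam=1.0, tau=0.1, iters=500):
    """Linearized Bregman iteration with J(x) = lam*||x||_1 + 0.5*||x||^2.

    The variable p accumulates (negative) gradient steps in the dual; the primal
    iterate is recovered via x^{k+1} = grad J*(p^{k+1}), here soft-thresholding.
    """
    x = x0.copy()
    p = np.zeros_like(x)            # p^0 in dJ(x^0) for x^0 = 0
    for _ in range(iters):
        p = p - tau * grad_E(x)     # subgradient update: p^{k+1} = p^k - tau*grad E(x^k)
        x = shrink(p, lam)          # primal update: x^{k+1} = grad J*(p^{k+1})
    return x

# Toy sparse least-squares example (hypothetical data), E(x) = 0.5*||Ax - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60))
x_true = np.zeros(60)
x_true[[3, 17, 42]] = [2.0, -1.5, 3.0]
b = A @ x_true
grad_E = lambda x: A.T @ (A @ x - b)
tau = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L for the least-squares gradient
x_rec = linearized_bregman(grad_E, np.zeros(60), lam=1.0, tau=tau, iters=500)
```

Note the characteristic coarse-to-fine behavior mentioned in the abstract: components of x stay at zero until their accumulated dual variable p crosses the threshold λ, so large-scale features enter the iterate first and finer ones later, which is what makes early stopping a meaningful regularizer here. The paper's contribution is extending convergence guarantees for this type of scheme to nonconvex E satisfying the Kurdyka--Łojasiewicz property.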


Updated: 2021-06-22