Level-Set Subdifferential Error Bounds and Linear Convergence of Bregman Proximal Gradient Method
Journal of Optimization Theory and Applications (IF 1.9), Pub Date: 2021-05-31, DOI: 10.1007/s10957-021-01865-4
Daoli Zhu, Sien Deng, Minghua Li, Lei Zhao

In this work, we develop a level-set subdifferential error bound condition with an eye toward convergence rate analysis of a variable Bregman proximal gradient (VBPG) method for a broad class of nonsmooth and nonconvex optimization problems. We prove that this condition guarantees linear convergence of VBPG and is weaker than the Kurdyka–Łojasiewicz property, weak metric subregularity, and the Bregman proximal error bound. Along the way, we derive a number of verifiable conditions under which level-set subdifferential error bounds hold, together with necessary conditions and sufficient conditions for linear convergence relative to a level set for nonsmooth and nonconvex optimization problems. The newly established results not only enable us to show that any accumulation point of the sequence generated by VBPG is at least a critical point with respect to the limiting subdifferential, or even a critical point with respect to the proximal subdifferential when a fixed Bregman function is used in each iteration, but also provide a fresh perspective for exploring the inner connections among many known sufficient conditions for linear convergence of various first-order methods.
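For orientation, the following is a minimal sketch of a generic Bregman proximal gradient step for a composite objective Φ = f + g with smooth f and nonsmooth g; the kernel φ_k, step size λ_k, solution set \bar{X}, constant κ, and the error bound displayed last are standard notation assumed here for illustration, not the paper's exact formulations.

% Bregman distance generated by a differentiable convex kernel \phi_k (assumed notation):
\[
  D_{\phi_k}(x, y) = \phi_k(x) - \phi_k(y) - \langle \nabla \phi_k(y),\, x - y \rangle.
\]
% One step of a Bregman proximal gradient scheme; "variable" refers to the
% kernel \phi_k being allowed to change with the iteration counter k:
\[
  x^{k+1} \in \operatorname*{argmin}_{x}
  \Big\{ \langle \nabla f(x^k),\, x - x^k \rangle + g(x)
         + \tfrac{1}{\lambda_k}\, D_{\phi_k}(x, x^k) \Big\}.
\]
% With \phi_k = \tfrac{1}{2}\|\cdot\|^2 this reduces to the classical proximal
% gradient step x^{k+1} = \mathrm{prox}_{\lambda_k g}(x^k - \lambda_k \nabla f(x^k)).
% A Luo–Tseng-type subdifferential error bound, shown only to convey the flavor
% of such conditions; the paper's level-set version differs in detail, and
% \bar{X} (solution set) and \kappa > 0 are illustrative placeholders:
\[
  \operatorname{dist}\big(x, \bar{X}\big) \le \kappa \, \operatorname{dist}\big(0, \partial \Phi(x)\big)
  \quad \text{for all } x \text{ in a level set of } \Phi.
\]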



Updated: 2021-06-01