Proximal-like incremental aggregated gradient method with Bregman distance in weakly convex optimization problems
Journal of Global Optimization (IF 1.3), Pub Date: 2021-05-29, DOI: 10.1007/s10898-021-01044-9
Zehui Jia , Jieru Huang , Xingju Cai

We focus on a special nonconvex and nonsmooth composite function, namely the sum of smooth weakly convex component functions and a proper, lower semicontinuous, weakly convex function. The proximal-like incremental aggregated gradient (PLIAG) method proposed in Zhang et al. (Math Oper Res 46(1):61–81, 2021) has been shown to be convergent and highly efficient for solving convex minimization problems. This algorithm not only avoids evaluating the exact full gradient, which can be expensive in big-data models, but also weakens the stringent global Lipschitz gradient continuity assumption on the smooth part of the problem. In the nonconvex case, however, there is little analysis of the convergence of the PLIAG method. In this paper, we prove that any limit point of the sequence generated by the PLIAG method is a critical point of the weakly convex problem. Under the further assumption that the objective function satisfies the Kurdyka–Łojasiewicz (KL) property, we prove that the generated sequence converges globally to a critical point of the problem. Additionally, we give the convergence rate when the Łojasiewicz exponent is known.
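To make the setting concrete, the problem class described in the abstract can be written schematically as follows; the notation (the components f_i, the nonsmooth term g, the step size β, the delays τ_i^k, and the Bregman kernel h) is assumed here for illustration and is not taken verbatim from the paper.

\[
  \min_{x \in \mathbb{R}^n} \; F(x) := \sum_{i=1}^{m} f_i(x) + g(x),
\]

where each f_i is smooth and weakly convex and g is proper, lower semicontinuous, and weakly convex. A generic Bregman-proximal incremental aggregated update, of which the PLIAG method is an instance under suitable step-size and bounded-delay conditions, takes the form

\[
  x^{k+1} \in \operatorname*{argmin}_{x}
  \Bigl\{ g(x)
    + \Bigl\langle \textstyle\sum_{i=1}^{m} \nabla f_i\bigl(x^{k-\tau_i^k}\bigr),\, x \Bigr\rangle
    + \tfrac{1}{\beta}\, D_h\bigl(x, x^k\bigr) \Bigr\},
\]

where \( D_h(x, y) = h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle \) is the Bregman distance generated by a kernel h. The aggregated gradient uses delayed component gradients instead of a full gradient evaluation at every iteration, and the Bregman term replaces the squared Euclidean proximal term, which is what allows the global Lipschitz gradient assumption to be relaxed.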



Updated: 2021-05-30