Perturbation Techniques for Convergence Analysis of Proximal Gradient Method and Other First-Order Algorithms via Variational Analysis
Set-Valued and Variational Analysis (IF 1.3), Pub Date: 2021-01-28, DOI: 10.1007/s11228-020-00570-0
Xiangfeng Wang, Jane J. Ye, Xiaoming Yuan, Shangzhi Zeng, Jin Zhang

We develop new perturbation techniques for conducting convergence analysis of various first-order algorithms for a class of nonsmooth optimization problems. We consider the iteration scheme of an algorithm to construct a perturbed stationary point set-valued map, and define the perturbing parameter by the difference of two consecutive iterates. Then, we show that the calmness condition of the induced set-valued map, together with a local version of the proper separation of stationary value condition, is a sufficient condition to ensure the linear convergence of the algorithm. The equivalence of the calmness condition to the one for the canonically perturbed stationary point set-valued map is proved, and this equivalence allows us to derive some sufficient conditions for calmness by using some recent developments in variational analysis. These sufficient conditions differ from existing results (especially the error-bound-based ones) in that they can be easily verified for many concrete application models. Our analysis is focused on the fundamental proximal gradient (PG) method, and it enables us to show that any accumulation point of the sequence generated by the PG method must be a stationary point in terms of the proximal subdifferential, instead of the limiting subdifferential. This result reveals the surprising fact that the quality of the solutions found by the PG method is in general superior. Our analysis also leads to some improvement of the linear convergence results for the PG method in the convex case. The new perturbation technique can be conveniently used to derive linear-rate convergence of a number of other first-order methods, including the well-known alternating direction method of multipliers and the primal-dual hybrid gradient method, under mild assumptions.
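As an illustration of the setting the abstract describes (not the paper's own code), the following sketch runs PG iterations on a LASSO model, min_x 0.5||Ax - b||² + λ||x||₁, whose proximal operator is soft-thresholding. The quantity ||x⁺ - x||, the difference of two consecutive iterates, is exactly what the paper uses as the perturbing parameter; the problem data, step size, and tolerance here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, step, iters=2000, tol=1e-10):
    """PG iterations x+ = prox_{step*lam*||.||_1}(x - step * grad f(x))."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                 # gradient of the smooth part
        x_new = soft_threshold(x - step * grad, step * lam)
        # ||x_new - x|| plays the role of the perturbing parameter:
        # it tends to 0 and vanishes exactly at a stationary point.
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Small synthetic instance (illustrative data).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + 0.01 * rng.standard_normal(20)
step = 1.0 / np.linalg.norm(A, 2) ** 2           # 1/L with L = ||A||_2^2
x_hat = proximal_gradient(A, b, lam=0.1, step=step)
```

At termination the fixed-point residual x - prox(x - step·∇f(x)) is below the tolerance, i.e. the computed point is (numerically) stationary, consistent with the paper's observation that accumulation points of PG are stationary in the proximal-subdifferential sense.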




Updated: 2021-01-28