On the superiority of PGMs to PDCAs in nonsmooth nonconvex sparse regression
Optimization Letters (IF 1.6), Pub Date: 2021-03-03, DOI: 10.1007/s11590-021-01716-1
Shummin Nakayama, Jun-ya Gotoh

This paper conducts a comparative study of proximal gradient methods (PGMs) and proximal DC algorithms (PDCAs) for sparse regression problems that can be cast as difference-of-two-convex-functions (DC) optimization problems. It has been shown that for DC optimization problems, both the General Iterative Shrinkage and Thresholding algorithm (GIST), a modified version of PGM, and PDCA converge to critical points. Recently, some enhanced versions of PDCA have been shown to converge to d-stationary points, which satisfy a stronger necessary condition for local optimality than critical points. In this paper we claim that, without any modification and under some technical assumptions, PGMs converge to d-stationary points not only for DC problems but also for more general nonsmooth nonconvex problems. While convergence to d-stationary points is known for the case where the step size is small enough, the finding of this paper is also valid for extended versions such as GIST and its alternating optimization version, which is developed in this paper. Numerical results show that among several algorithms in the two categories, modified versions of PGM perform best not only in solution quality but also in computation time.
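To make the contrast between the two iteration schemes concrete, the following is a minimal sketch, not the authors' implementation, of a PGM/GIST-style step and a PDCA-style step on a capped-L1 regularized least-squares problem, a standard sparse-regression instance admitting a DC decomposition. The PGM step applies the exact proximal operator of the full nonconvex penalty; the PDCA step first linearizes the concave part of the DC decomposition and then applies the L1 proximal operator (soft-thresholding). All problem data, penalty parameters, the fixed 1/L step size, and the helper names are illustrative assumptions, not taken from the paper.

# A minimal sketch (assumed setup, not the paper's code) comparing PGM and
# PDCA on:  min_x  0.5*||Ax - b||^2 + lam * sum_i min(|x_i|, theta).
# The capped-L1 penalty is DC: lam*|t| - lam*max(|t| - theta, 0).
import numpy as np

def grad_ls(A, b, x):
    """Gradient of the least-squares loss 0.5*||Ax - b||^2."""
    return A.T @ (A @ x - b)

def obj(A, b, x, lam, theta):
    """Full objective: loss plus capped-L1 penalty."""
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 \
        + lam * np.minimum(np.abs(x), theta).sum()

def prox_capped_l1(y, t, lam, theta):
    """Exact componentwise prox of t*lam*min(|.|, theta): compare the
    closed-form minimizers of the two branches and keep the better one."""
    # Branch |x| <= theta: soft-threshold, then clip to [-theta, theta].
    x1 = np.sign(y) * np.minimum(theta, np.maximum(np.abs(y) - t * lam, 0.0))
    # Branch |x| >= theta: penalty is constant there; clip |x| up to theta.
    x2 = np.sign(y) * np.maximum(theta, np.abs(y))
    f1 = 0.5 * (x1 - y) ** 2 + t * lam * np.minimum(np.abs(x1), theta)
    f2 = 0.5 * (x2 - y) ** 2 + t * lam * np.minimum(np.abs(x2), theta)
    return np.where(f1 <= f2, x1, x2)

def pgm(A, b, lam, theta, iters=500):
    """PGM/GIST-style iteration: one exact prox step on the nonconvex
    penalty per gradient step (fixed step size 1/L for simplicity)."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = prox_capped_l1(x - grad_ls(A, b, x) / L, 1.0 / L, lam, theta)
    return x

def pdca(A, b, lam, theta, iters=500):
    """PDCA-style iteration: linearize the subtracted convex part
    lam*max(|t| - theta, 0) at x_k, then soft-threshold (L1 prox)."""
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        s = lam * np.sign(x) * (np.abs(x) > theta)  # subgradient term
        y = x - (grad_ls(A, b, x) - s) / L
        x = np.sign(y) * np.maximum(np.abs(y) - lam / L, 0.0)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 200))
    x_true = np.zeros(200)
    x_true[:5] = 3.0 * rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    lam, theta = 0.5, 1.0
    for name, solver in [("PGM", pgm), ("PDCA", pdca)]:
        x = solver(A, b, lam, theta)
        print(name, "objective:", obj(A, b, x, lam, theta))

Note the structural difference the paper studies: the PGM step handles the nonconvex penalty directly through its (here closed-form) proximal operator, whereas the PDCA step only ever solves convex L1 subproblems, at the cost of working with a linearized model of the concave part.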


