Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
Mathematical Methods of Operations Research (IF 0.9), Pub Date: 2019-06-26, DOI: 10.1007/s00186-019-00674-w
Masoud Ahookhosh

We introduce four accelerated (sub)gradient algorithms (ASGA) for solving several classes of convex optimization problems. More specifically, we propose two estimation sequences majorizing the objective function and develop two iterative schemes for each of them. In both cases, the first scheme requires the smoothness parameter and a Hölder constant, while the second scheme is parameter-free (except for the strong convexity parameter, which we set to zero if it is not available) at the price of applying a finitely terminated backtracking line search. The proposed algorithms attain the optimal complexity for smooth problems with Lipschitz continuous gradients, nonsmooth problems with bounded variation of subgradients, and weakly smooth problems with Hölder continuous gradients. Further, for strongly convex problems, they are optimal for smooth problems and nearly optimal for nonsmooth and weakly smooth problems. Finally, numerical results for some applications in sparse optimization and machine learning are reported, which confirm the theoretical foundations.
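To illustrate the kind of scheme the abstract describes (an accelerated first-order step combined with a finitely terminated backtracking search on a local smoothness estimate), here is a minimal Python sketch of a generic FISTA-style accelerated gradient method with backtracking. It is not the paper's ASGA algorithm, and the function names and example data are hypothetical, chosen only to make the idea concrete.

    import numpy as np

    def accelerated_gradient_backtracking(f, grad, x0, L0=1.0, max_iter=500, tol=1e-8):
        # Generic accelerated gradient method with backtracking on the local
        # Lipschitz estimate L (FISTA-style sketch; not the paper's ASGA).
        x = y = np.asarray(x0, dtype=float)
        t, L = 1.0, L0
        for _ in range(max_iter):
            g = grad(y)
            # Backtracking: increase L until the quadratic model majorizes f at the trial point.
            while True:
                x_new = y - g / L
                if f(x_new) <= f(y) + g @ (x_new - y) + 0.5 * L * np.linalg.norm(x_new - y) ** 2:
                    break
                L *= 2.0
            # Standard momentum update of the extrapolation point.
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)
            if np.linalg.norm(x_new - x) <= tol:
                x = x_new
                break
            x, t = x_new, t_new
            L /= 2.0  # let the smoothness estimate decrease again between iterations
        return x

    # Hypothetical usage: least-squares problem min_x 0.5 * ||A x - b||^2.
    A = np.random.randn(50, 10)
    b = np.random.randn(50)
    x_star = accelerated_gradient_backtracking(
        lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2,
        lambda x: A.T @ (A @ x - b),
        np.zeros(10))

The backtracking loop plays the role of the parameter-free variant mentioned above: instead of supplying a smoothness or Hölder constant in advance, the method adaptively tests candidate step sizes until a local quadratic upper bound holds.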

Updated: 2019-06-26