Stochastic proximal linear method for structured non-convex problems
Optimization Methods & Software (IF 2.2), Pub Date: 2020-04-29, DOI: 10.1080/10556788.2020.1754413
Tamir Hazan, Shoham Sabach, Sergey Voldman

In this work, motivated by the challenging task of learning a deep neural network, we consider optimization problems that consist of minimizing a finite sum of non-convex and non-smooth functions, where the non-smoothness appears as a maximum of non-convex functions with Lipschitz continuous gradients. Due to the large size of the sum, in practice we focus on stochastic first-order methods and propose the Stochastic Proximal Linear Method (SPLM), which is based on minimizing an appropriate majorizer at each iteration and is guaranteed to converge almost surely to a critical point of the objective function; we also prove its rate of convergence to critical points.
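To make the iteration concrete, here is a minimal sketch of a stochastic prox-linear step on a toy instance, a robust phase-retrieval-style objective f(x) = (1/n) Σ_j |⟨a_j, x⟩² − b_j|, where each |·| is a maximum of two smooth non-convex terms. The problem instance, step-size schedule, and the closed-form subproblem solution via soft-thresholding are illustrative assumptions, not the paper's exact setting or guarantees:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (an assumption, not from the paper):
# f(x) = (1/n) * sum_j |<a_j, x>^2 - b_j|, with |u| = max(u, -u),
# so each summand is a max of two smooth non-convex functions.
n, dim = 200, 5
A = rng.standard_normal((n, dim))
x_true = rng.standard_normal(dim)
b = (A @ x_true) ** 2

def objective(x):
    return np.mean(np.abs((A @ x) ** 2 - b))

def splm_step(x, j, t):
    """One stochastic prox-linear step on the sampled component j.

    Minimizes the convex majorizer
        |c_j(x) + grad_c_j(x)^T d| + ||d||^2 / (2t)
    over d. Here the solution is available in closed form: the
    minimizer lies along the gradient direction, and the linearized
    residual is soft-thresholded at level t * ||g||^2.
    """
    r = A[j] @ x
    c = r ** 2 - b[j]        # smooth inner function value c_j(x)
    g = 2.0 * r * A[j]       # its gradient
    q = g @ g
    if q == 0.0:
        return x
    beta = np.sign(c) * max(abs(c) - t * q, 0.0)
    return x + ((beta - c) / q) * g

x = rng.standard_normal(dim)
f_init = objective(x)
t0 = 0.01
for k in range(20000):
    j = rng.integers(n)
    # Diminishing proximal parameter: a common heuristic, an assumption here
    x = splm_step(x, j, t0 / np.sqrt(k + 1))
f_final = objective(x)

print(f_init, f_final)  # objective before vs. after; should decrease
```

The closed-form subproblem solve exploits that the inner maximum here involves only two terms (±c_j); for a general maximum of several functions, each majorizer is a convex piecewise-linear-plus-quadratic problem and would typically be handled by a small QP solver.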




Updated: 2020-04-29