Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization
Journal of Optimization Theory and Applications ( IF 1.6 ) Pub Date : 2019-12-24 , DOI: 10.1007/s10957-019-01624-6
Seonho Park , Seung Hyun Jung , Panos M. Pardalos

We focus on minimizing nonconvex finite-sum functions that typically arise in machine learning problems. For this class of problems, the adaptive cubic-regularized Newton method has shown strong global convergence guarantees and the ability to escape strict saddle points. In this paper, we extend this algorithm by incorporating the negative curvature method, so that the iterate is updated even at unsuccessful iterations. We call the new method Stochastic Adaptive cubic regularization with Negative Curvature (SANC). Unlike previous methods, the SANC algorithm obtains stochastic gradient and Hessian estimators from independent sets of data points of consistent size across all iterations, which makes it more practical for solving large-scale machine learning problems. To the best of our knowledge, this is the first approach that combines the negative curvature method with the adaptive cubic-regularized Newton method. Finally, we provide experimental results, including on neural network problems, supporting the efficiency of our method.
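The abstract combines three ingredients: a cubic-regularized model step, a negative-curvature update when the model step fails to decrease the objective, and an adaptive regularization parameter. The following is a minimal, hypothetical sketch of that iteration pattern on a toy 2-D nonconvex function. It is not the paper's SANC algorithm: it uses exact gradients and Hessians instead of mini-batch stochastic estimators, solves the cubic subproblem by eigendecomposition plus bisection rather than a Krylov method, and the step sizes, tolerances, and update factors are illustrative choices.

```python
import numpy as np

def f(x):
    # Toy nonconvex objective with minima at (+/-1, 0) and a saddle at the origin.
    return (x[0]**2 - 1.0)**2 + x[1]**2

def grad(x):
    return np.array([4.0 * x[0] * (x[0]**2 - 1.0), 2.0 * x[1]])

def hess(x):
    return np.array([[12.0 * x[0]**2 - 4.0, 0.0], [0.0, 2.0]])

def cubic_subproblem(g, H, sigma):
    # Globally minimize the cubic model m(s) = g^T s + 0.5 s^T H s + (sigma/3)||s||^3
    # via eigendecomposition of H and bisection on r = ||s||.
    # (The degenerate "hard case" is ignored in this sketch.)
    lam, V = np.linalg.eigh(H)          # eigenvalues ascending
    gt = V.T @ g
    lo = max(0.0, -lam[0] / sigma) + 1e-12   # keep H + sigma*r*I positive definite
    hi = lo + np.linalg.norm(g) / sigma + 1.0
    while np.linalg.norm(gt / (lam + sigma * hi)) > hi:
        hi *= 2.0                        # expand until the secular equation brackets a root
    for _ in range(100):
        r = 0.5 * (lo + hi)
        if np.linalg.norm(gt / (lam + sigma * r)) > r:
            lo = r
        else:
            hi = r
    r = 0.5 * (lo + hi)
    return -V @ (gt / (lam + sigma * r))

def negative_curvature_step(x, g, H, eta=0.5):
    # Move along the eigenvector of the most negative eigenvalue,
    # oriented downhill; no-op when H has no significant negative curvature.
    lam, V = np.linalg.eigh(H)
    if lam[0] >= -1e-8:
        return x
    v = V[:, 0] if g @ V[:, 0] <= 0 else -V[:, 0]
    return x + eta * (-lam[0]) * v

def sanc_sketch(x0, sigma=1.0, max_iter=50):
    # Hypothetical loop echoing the paper's structure: accept the cubic step when
    # it decreases f; otherwise still update, via a negative-curvature direction.
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        s = cubic_subproblem(g, H, sigma)
        if f(x + s) < f(x):
            x = x + s                             # successful iteration: accept
            sigma = max(0.5 * sigma, 1e-3)        # relax the regularization
        else:
            x = negative_curvature_step(x, g, H)  # unsuccessful: negative-curvature update
            sigma *= 2.0                          # tighten the regularization
    return x
```

Starting near the saddle region, e.g. `sanc_sketch(np.array([0.1, 1.0]))`, the iterates escape via the negative-curvature direction and settle at one of the minimizers (±1, 0).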

Updated: 2019-12-24