Adaptive regularization with cubics on manifolds
Mathematical Programming (IF 2.7), Pub Date: 2020-05-13, DOI: 10.1007/s10107-020-01505-1
Naman Agarwal, Nicolas Boumal, Brian Bullins, Coralia Cartis

Adaptive regularization with cubics (ARC) is an algorithm for unconstrained, non-convex optimization. Akin to the popular trust-region method, its iterations can be thought of as approximate, safeguarded Newton steps. For cost functions with Lipschitz continuous Hessian, ARC has optimal iteration complexity, in the sense that it produces an iterate with gradient norm smaller than $\varepsilon$ in $O(1/\varepsilon^{1.5})$ iterations. For the same price, it can also guarantee that the smallest eigenvalue of the Hessian is larger than $-\varepsilon^{1/2}$. In this paper, we study a generalization of ARC to optimization on Riemannian manifolds. In particular, we generalize the iteration complexity results to this richer framework. Our central contribution lies in the identification of appropriate manifold-specific assumptions that allow us to secure these complexity guarantees both when using the exponential map and when using a general retraction. A substantial part of the paper is devoted to studying these assumptions, which are relevant beyond ARC, and to providing user-friendly sufficient conditions for them. Numerical experiments are encouraging.
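For readers unfamiliar with ARC, here is a sketch of the cubic-regularized model behind each iteration, written for the Riemannian setting with a retraction; the notation below is illustrative shorthand, not the paper's exact statement. At the current iterate $x_k$, the step $s_k$ approximately minimizes, over the tangent space $T_{x_k}\mathcal{M}$,

$$ m_k(s) \,=\, f(x_k) + \langle \mathrm{grad}\, f(x_k),\, s \rangle_{x_k} + \tfrac{1}{2} \langle \mathrm{Hess}\, f(x_k)[s],\, s \rangle_{x_k} + \tfrac{\sigma_k}{3} \| s \|_{x_k}^{3}, $$

after which the candidate $R_{x_k}(s_k)$ is accepted only if the actual decrease in $f$ is a sufficient fraction of the decrease predicted by the model, and the regularization weight $\sigma_k$ is adapted accordingly (decreased after successful steps, increased after rejected ones). In the Euclidean case, $R_x(s) = x + s$ and this reduces to classical ARC.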
