Stochastic Subgradient Method Converges on Tame Functions
Foundations of Computational Mathematics (IF 3) Pub Date: 2019-01-07, DOI: 10.1007/s10208-018-09409-5
Damek Davis, Dmitriy Drusvyatskiy, Sham Kakade, Jason D. Lee

This work considers the question: what convergence guarantees does the stochastic subgradient method have in the absence of smoothness and convexity? We prove that the stochastic subgradient method, on any semialgebraic locally Lipschitz function, produces limit points that are all first-order stationary. More generally, our result applies to any function with a Whitney stratifiable graph. In particular, this work endows the stochastic subgradient method, and its proximal extension, with rigorous convergence guarantees for a wide class of problems arising in data science—including all popular deep learning architectures.
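For concreteness, the method in question is the classical stochastic subgradient iteration x_{k+1} = x_k − α_k (g_k + ξ_k), where g_k is a (Clarke) subgradient of the objective at x_k, ξ_k is zero-mean noise, and the step sizes α_k diminish. The sketch below is illustrative only and is not taken from the paper: the test function f(x) = | ‖x‖₂ − 1 | (nonconvex, nonsmooth, semialgebraic), the step-size schedule, and the noise level are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def subgrad(x):
    """One Clarke subgradient of f(x) = | ||x||_2 - 1 |,
    a nonconvex, nonsmooth, semialgebraic function."""
    nrm = np.linalg.norm(x)
    if nrm == 0.0:
        return np.zeros_like(x)          # 0 lies in the Clarke subdifferential at the origin
    return np.sign(nrm - 1.0) * (x / nrm)  # usual gradient away from the kinks

x = np.array([3.0, -2.0])
for k in range(1, 20001):
    alpha = 1.0 / k ** 0.75                              # not summable, but square-summable
    g = subgrad(x) + 0.1 * rng.standard_normal(x.shape)  # noisy subgradient oracle
    x = x - alpha * g

print(np.linalg.norm(x))  # typically close to 1.0: the iterate settles near the unit sphere
```

Consistent with the stated guarantee, the limit points of such a run are first-order stationary; in this toy problem they lie on the unit sphere, where f attains its minimum.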

Last updated: 2019-01-07