Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
Mathematical Programming (IF 2.2), Pub Date: 2020-04-15, DOI: 10.1007/s10107-020-01501-5
Jérôme Bolte, Edouard Pauwels

Modern problems in AI and numerical analysis require nonsmooth approaches with a flexible calculus. We introduce generalized derivatives called conservative fields, for which we develop a calculus and provide representation formulas. Functions having a conservative field are called path differentiable: convex, concave, Clarke regular, and all semialgebraic Lipschitz continuous functions are path differentiable. Using Whitney stratification techniques for semialgebraic and definable sets, our model provides variational formulas for nonsmooth automatic differentiation oracles, such as the famous backpropagation algorithm in deep learning. Our differential model is applied to establish the convergence in values of nonsmooth stochastic gradient methods as they are implemented in practice.
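The defining property of path differentiability can be illustrated numerically. The following toy sketch (not from the paper) uses the one-dimensional example f(x) = relu(x) with the common autodiff convention relu'(0) = 0: because the kink at 0 has measure zero, integrating the chosen derivative selection along a path recovers the increment of f, which is what makes such autodiff conventions conservative-field selections.

```python
# Toy 1D illustration (assumed example, not from the paper): for a
# path-differentiable f, f(b) - f(a) equals the integral of any measurable
# selection of the conservative field along the path, regardless of the
# value assigned at nondifferentiable points.

def relu(x):
    return max(x, 0.0)

def relu_grad(x):
    # Autodiff-style convention: derivative 0 at the kink x = 0.
    # Any value in [0, 1] at x = 0 would give the same integral,
    # since the singleton {0} has measure zero.
    return 1.0 if x > 0 else 0.0

def path_integral(grad, a, b, n=100_000):
    # Midpoint Riemann sum of the chosen derivative selection over [a, b].
    h = (b - a) / n
    return sum(grad(a + (i + 0.5) * h) for i in range(n)) * h

a, b = -1.0, 2.0
lhs = relu(b) - relu(a)                   # increment of f: 2.0
rhs = path_integral(relu_grad, a, b)      # integral of the selection
print(abs(lhs - rhs) < 1e-3)              # the integral recovers the increment
```

With the step size above, the discretization error is on the order of 1e-5, so the check passes; a genuinely non-conservative choice of derivative values on a set of positive measure would break this identity.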
