Natural Gradient for Combined Loss Using Wavelets
Journal of Scientific Computing (IF 2.5), Pub Date: 2021-01-07, DOI: 10.1007/s10915-020-01367-x
Lexing Ying

Natural gradients have been widely used in the optimization of loss functionals over probability space, with important examples such as Fisher–Rao gradient descent for the Kullback–Leibler divergence, Wasserstein gradient descent for transport-related functionals, and Mahalanobis gradient descent for quadratic loss functionals. This note considers the situation in which the loss is a convex linear combination of these examples. We propose a new natural gradient algorithm that utilizes compactly supported wavelets to approximately diagonalize the Hessian of the combined loss. Numerical results are included to demonstrate the efficiency of the proposed algorithm.
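To illustrate the core idea only (this is a minimal sketch, not the paper's algorithm: the Haar basis, the function names, and the quadratic test loss are all assumptions made here), a natural gradient step preconditions the ordinary gradient by the inverse Hessian; if an orthogonal wavelet transform W approximately diagonalizes the Hessian as H ≈ Wᵀ diag(d) W, the step reduces to an elementwise division in the wavelet domain:

```python
import numpy as np

def haar(v):
    """Orthonormal multilevel Haar transform (length must be a power of 2)."""
    v = v.astype(float)
    out = np.empty_like(v)
    n = len(v)
    while n > 1:
        half = n // 2
        a = (v[:n:2] + v[1:n:2]) / np.sqrt(2.0)  # scaling (average) coefficients
        d = (v[:n:2] - v[1:n:2]) / np.sqrt(2.0)  # wavelet (detail) coefficients
        out[half:n] = d
        v[:half] = a
        n = half
    out[0] = v[0]
    return out

def ihaar(w):
    """Inverse Haar transform (the transform is orthogonal)."""
    w = w.astype(float)
    N, n = len(w), 2
    while n <= N:
        half = n // 2
        a, d = w[:half].copy(), w[half:n].copy()
        w[:n:2] = (a + d) / np.sqrt(2.0)
        w[1:n:2] = (a - d) / np.sqrt(2.0)
        n *= 2
    return w

def natural_gradient_step(x, grad, hess_diag_wavelet, lr=1.0):
    """One preconditioned step: send the gradient to the wavelet domain,
    divide by the (assumed diagonal) Hessian there, and transform back."""
    return x - lr * ihaar(haar(grad) / hess_diag_wavelet)
```

When the Hessian is exactly diagonal in the wavelet basis, a single step with unit learning rate is a Newton step and minimizes a quadratic loss in one iteration; in the setting of the paper, the wavelet diagonal only approximates the Hessian of the combined loss, so the step acts as an efficient preconditioner rather than an exact solve.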



