A Gradient Descent Perspective on Sinkhorn
Applied Mathematics and Optimization (IF 1.8), Pub Date: 2020-07-01, DOI: 10.1007/s00245-020-09697-w
Flavien Léger

We present a new perspective on the popular Sinkhorn algorithm, showing that it can be seen as a Bregman gradient descent (mirror descent) of a relative entropy (Kullback–Leibler divergence). This viewpoint implies a new sublinear convergence rate with a robust constant.
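For context, the Sinkhorn algorithm referenced in the abstract is the classical alternating-scaling scheme for entropically regularized optimal transport: starting from the Gibbs kernel K = exp(-C/ε), it alternately rescales rows and columns of the plan so that it matches the prescribed marginals. The sketch below is a minimal NumPy implementation of these standard iterations (the function name, parameters, and test data are illustrative and not taken from the paper); the paper's contribution is to interpret each such update as a Bregman gradient descent (mirror descent) step with respect to the Kullback–Leibler divergence, which is what yields the stated sublinear rate.

```python
import numpy as np

def sinkhorn(mu, nu, C, epsilon=0.1, n_iters=500):
    """Standard Sinkhorn iterations for entropic optimal transport.

    mu, nu  : source and target marginals (1-D arrays summing to 1)
    C       : cost matrix of shape (len(mu), len(nu))
    epsilon : entropic regularization strength
    Returns an approximate transport plan P whose row sums match mu
    and column sums match nu after n_iters alternating updates.
    """
    K = np.exp(-C / epsilon)          # Gibbs kernel
    u = np.ones_like(mu)
    v = np.ones_like(nu)
    for _ in range(n_iters):
        u = mu / (K @ v)              # rescale to match row marginals
        v = nu / (K.T @ u)            # rescale to match column marginals
    return u[:, None] * K * v[None, :]

# Illustrative usage: transport between two random discrete distributions
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
mu = rng.random(50); mu /= mu.sum()
nu = rng.random(50); nu /= nu.sum()
C = (x[:, None] - x[None, :]) ** 2    # squared-distance cost
P = sinkhorn(mu, nu, C)
print(np.abs(P.sum(axis=1) - mu).max())  # row-marginal error, should be small
```

Each half-step above can be read as a KL (Bregman) projection onto one of the two marginal-constraint sets, which is the structure the mirror-descent viewpoint in the paper exploits.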


