Algorithms for Nonnegative Matrix Factorization with the Kullback–Leibler Divergence
Journal of Scientific Computing (IF 2.5). Pub Date: 2021-05-08. DOI: 10.1007/s10915-021-01504-0
Le Thi Khanh Hien, Nicolas Gillis

Nonnegative matrix factorization (NMF) is a standard linear dimensionality reduction technique for nonnegative data sets. To measure the discrepancy between the input data and the low-rank approximation, the Kullback–Leibler (KL) divergence is one of the most widely used objective functions for NMF. It corresponds to the maximum likelihood estimator when the underlying statistics of the observed data samples follow a Poisson distribution, and KL NMF is particularly meaningful for count data sets, such as documents. In this paper, we first collect important properties of the KL objective function that are essential to study the convergence of KL NMF algorithms. Second, together with reviewing existing algorithms for solving KL NMF, we propose three new algorithms that guarantee the non-increasingness of the objective function. We also provide a global convergence guarantee for one of our proposed algorithms. Finally, we conduct extensive numerical experiments to provide a comprehensive picture of the performance of the KL NMF algorithms.
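To make the setup concrete, the sketch below implements the classical multiplicative-update rule for KL NMF due to Lee and Seung — one of the existing baseline algorithms a survey like this reviews, not one of the paper's three new algorithms. The KL objective it decreases is D(V‖WH) = Σᵢⱼ (Vᵢⱼ log(Vᵢⱼ/(WH)ᵢⱼ) − Vᵢⱼ + (WH)ᵢⱼ); the update rule guarantees this objective is non-increasing at each iteration. Function names, the small `eps` safeguard, and the random initialization are illustrative choices, not taken from the paper.

```python
import numpy as np

def kl_div(V, WH, eps=1e-12):
    """KL divergence D(V || WH) for nonnegative matrices of the same shape."""
    return np.sum(V * np.log((V + eps) / (WH + eps)) - V + WH)

def kl_nmf_mu(V, r, n_iter=200, seed=0):
    """Classical multiplicative updates (Lee & Seung) for KL NMF.

    Factorizes V (m x n, nonnegative) as W @ H with W (m x r), H (r x n).
    Each sweep applies the standard closed-form updates:
        H <- H * (W^T (V / WH)) / (W^T 1)
        W <- W * ((V / WH) H^T) / (1 H^T)
    which keep the KL objective non-increasing.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 0.1   # strictly positive init
    H = rng.random((r, n)) + 0.1
    eps = 1e-12                    # guard against division by zero
    for _ in range(n_iter):
        WH = W @ H + eps
        # Denominator W^T @ ones(m, n) reduces to column sums of W.
        H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + eps)
        WH = W @ H + eps
        # Denominator ones(m, n) @ H^T reduces to row sums of H.
        W *= ((V / WH) @ H.T) / (H.sum(axis=1)[None, :] + eps)
    return W, H
```

Because the updates are multiplicative, W and H stay elementwise nonnegative whenever they start positive, which is why a strictly positive initialization is used.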




Updated: 2021-05-08