Leveraging Joint-Diagonalization in Transform-Learning NMF
IEEE Transactions on Signal Processing (IF 5.4), Pub Date: 2022-07-04, DOI: 10.1109/tsp.2022.3188177
Sixin Zhang, Emmanuel Soubies, Cedric Fevotte

Non-negative matrix factorization with transform learning (TL-NMF) is a recent approach that aims to learn data representations suited to NMF. In this work, we relate TL-NMF to the classical matrix joint-diagonalization (JD) problem. We show that, when the number of data realizations is sufficiently large, TL-NMF can be replaced by a two-step approach, termed JD+NMF, that estimates the transform through JD prior to NMF computation. In contrast, we find that when the number of data realizations is limited, not only is JD+NMF no longer equivalent to TL-NMF, but the inherent low-rank constraint of TL-NMF turns out to be an essential ingredient for learning meaningful transforms for NMF.
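To make the two-step JD+NMF pipeline concrete, here is a minimal Python sketch under stated assumptions: synthetic Gaussian data, an orthogonal transform Phi, and a crude stand-in for a true joint-diagonalization solver (the eigenbasis of the averaged realization covariance, in place of, e.g., a Jacobi-type JD algorithm). It is not the authors' implementation; the NMF step uses scikit-learn.

# Hypothetical sketch of the JD+NMF two-step approach:
# (1) estimate a transform by (approximate) joint diagonalization,
# (2) run standard NMF on the power of the transformed data.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
M, N, R = 32, 200, 5              # transform size, number of realizations, NMF rank
Y = rng.standard_normal((M, N))   # data matrix: N realizations of length M

# Step 1: approximate JD of the realization covariances y_n y_n^T.
# Diagonalizing their average is a simple surrogate for a true JD solver.
C_mean = (Y @ Y.T) / N
_, Phi = np.linalg.eigh(C_mean)
Phi = Phi.T                       # rows of Phi form an orthonormal basis

# Step 2: standard NMF on the nonnegative "power spectrogram" of Phi @ Y.
V = np.abs(Phi @ Y) ** 2
model = NMF(n_components=R, init="nndsvda", max_iter=500)
W = model.fit_transform(V)
H = model.components_
print("reconstruction error:", np.linalg.norm(V - W @ H))

As the abstract notes, this decoupled pipeline is only justified when the number of realizations N is large; with few realizations, the joint TL-NMF formulation with its low-rank constraint is needed instead.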
