Large-Scale Low-Rank Matrix Learning with Nonconvex Regularizers.
IEEE Transactions on Pattern Analysis and Machine Intelligence (IF 23.6), Pub Date: 2018-07-20, DOI: 10.1109/tpami.2018.2858249
Quanming Yao, James T. Kwok, Taifeng Wang, Tie-Yan Liu

Low-rank modeling has many important applications in computer vision and machine learning. While the matrix rank is often approximated by the convex nuclear norm, the use of nonconvex low-rank regularizers has demonstrated better empirical performance. However, the resulting optimization problem is much more challenging. Recent state-of-the-art methods require an expensive full SVD in each iteration. In this paper, we show that for many commonly used nonconvex low-rank regularizers, the singular values obtained from the proximal operator are automatically thresholded. This allows the proximal operator to be efficiently approximated by the power method. We then develop a fast proximal algorithm and its accelerated variant with an inexact proximal step. It can be guaranteed that the squared distance between consecutive iterates converges at a rate of $O(1/T)$, where $T$ is the number of iterations. Furthermore, we show that the proposed algorithm can be parallelized, and the resultant algorithm achieves nearly linear speedup w.r.t. the number of threads. Extensive experiments are performed on matrix completion and robust principal component analysis. Significant speedup over the state-of-the-art is observed.
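The key observation in the abstract, that the proximal operator of a nonconvex low-rank regularizer automatically zeroes out small singular values, so a cheap partial SVD (obtainable by the power method) suffices, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the capped-ℓ1 penalty $\lambda\min(\sigma,\theta)$ and all function names and parameters (`power_method_range`, `lam`, `theta`, the iteration counts) are assumptions chosen for concreteness.

```python
import numpy as np

def power_method_range(X, k, n_iter=20, seed=0):
    """Approximate an orthonormal basis of the top-k left singular
    subspace of X via subspace (power) iteration -- no full SVD."""
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(X @ rng.standard_normal((X.shape[1], k)))
    for _ in range(n_iter):
        Q, _ = np.linalg.qr(X @ (X.T @ Q))
    return Q  # shape (m, k), orthonormal columns

def prox_capped_l1(sigma, lam, theta):
    """Elementwise proximal map of the capped-l1 penalty
    lam * min(sigma, theta) (an illustrative nonconvex regularizer).
    Compare the minimizers of the two branches and keep the cheaper one;
    small singular values are mapped exactly to zero ("automatic
    thresholding")."""
    x1 = np.clip(sigma - lam, 0.0, theta)          # branch x <= theta
    x2 = np.maximum(sigma, theta)                  # branch x >= theta
    f1 = 0.5 * (x1 - sigma) ** 2 + lam * x1
    f2 = 0.5 * (x2 - sigma) ** 2 + lam * theta
    return np.where(f1 <= f2, x1, x2)

def inexact_prox_step(X, k, lam, theta):
    """One inexact proximal step: project onto an approximate rank-k
    subspace, take a small k x n SVD, threshold the singular values."""
    Q = power_method_range(X, k)
    U, s, Vt = np.linalg.svd(Q.T @ X, full_matrices=False)  # k x n, cheap
    s_prox = prox_capped_l1(s, lam, theta)
    r = int(np.sum(s_prox > 0))                    # surviving rank
    return (Q @ U[:, :r]) * s_prox[:r] @ Vt[:r]
```

Because the prox sets all singular values below a data-independent cutoff to zero, only the leading subspace ever matters, which is why the power method (cost roughly $O(mnk)$ per step) can replace the full SVD (cost $O(mn\min(m,n))$).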

Updated: 2019-10-23