Compressed Dictionary Learning
Journal of Fourier Analysis and Applications (IF 1.2), Pub Date: 2020-03-09, DOI: 10.1007/s00041-020-09738-6
Karin Schnass , Flavio Teixeira

In this paper we show that the computational complexity of the iterative thresholding and K-residual-means (ITKrM) algorithm for dictionary learning can be significantly reduced by using dimensionality-reduction techniques based on the Johnson–Lindenstrauss lemma. The dimensionality reduction is efficiently carried out with the fast Fourier transform. We introduce the iterative compressed-thresholding and K-means (IcTKM) algorithm for fast dictionary learning and study its convergence properties. We show that IcTKM can locally recover an incoherent, overcomplete generating dictionary of \(K\) atoms from training signals of sparsity level \(S\) with high probability. Fast dictionary learning is achieved by embedding the training data and the dictionary into \(m < d\) dimensions, and recovery is shown to be locally stable with an embedding dimension which scales as low as \(m = O(S \log^4 S \log^3 K)\). The compression effectively shatters the data-dimension bottleneck in the computational cost of ITKrM, reducing it by a factor \(O(m/d)\). Our theoretical results are complemented with numerical simulations which demonstrate that IcTKM is a powerful, low-cost algorithm for learning dictionaries from high-dimensional data sets.
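To make the idea concrete, below is a minimal Python/NumPy sketch of the two ingredients the abstract describes: an FFT-based Johnson–Lindenstrauss embedding (random sign flips, FFT, row subsampling) and a dictionary-update step in the spirit of IcTKM, where thresholding uses compressed inner products and the residual averages use the original signals. The function names, the exact embedding construction, and the update details are illustrative assumptions, not the paper's precise algorithm.

```python
import numpy as np

def fast_jl_embed(X, rows, signs):
    """Subsampled randomized Fourier embedding of the columns of X (d x N):
    sign-flip each coordinate, apply the FFT, keep m rows.
    Cost per column is O(d log d) instead of O(d m) for a dense JL matrix."""
    m = rows.size
    return np.fft.fft(signs[:, None] * X, axis=0)[rows, :] / np.sqrt(m)

def ictkm_like_iteration(Y, D, S, m, rng):
    """One highly simplified dictionary-update step in the spirit of IcTKM
    (an assumption-laden sketch, not the authors' exact method).
    Y is d x N training data, D is d x K with unit-norm columns."""
    d, N = Y.shape
    D_new = np.zeros_like(D)
    # Draw a fresh embedding for this iteration: sign flips + row subsample.
    signs = rng.choice([-1.0, 1.0], size=d)
    rows = rng.choice(d, size=m, replace=False)
    Yc = fast_jl_embed(Y, rows, signs)
    Dc = fast_jl_embed(D, rows, signs)
    # Compressed thresholding: per signal, keep the S atoms whose
    # compressed correlations are largest in magnitude.
    corr = np.abs(Dc.conj().T @ Yc)              # K x N
    supports = np.argsort(-corr, axis=0)[:S, :]  # S x N
    for n in range(N):
        I = supports[:, n]
        # Orthogonal projection of the signal onto the selected atoms.
        coeffs, *_ = np.linalg.lstsq(D[:, I], Y[:, n], rcond=None)
        residual = Y[:, n] - D[:, I] @ coeffs
        for k in I:
            # Residual mean: add the atom's own contribution back in,
            # signed by its correlation with the signal.
            ip = D[:, k] @ Y[:, n]
            D_new[:, k] += np.sign(ip) * (residual + D[:, k] * ip)
    # Renormalize the updated atoms (guard against unused atoms).
    norms = np.linalg.norm(D_new, axis=0)
    norms[norms == 0] = 1.0
    return D_new / norms
```

As a usage sketch, with `rng = np.random.default_rng()`, synthetic data `Y` of shape `(d, N)`, and a unit-norm initialization `D` of shape `(d, K)`, repeated calls `D = ictkm_like_iteration(Y, D, S, m, rng)` refine the dictionary estimate while the per-iteration cost of the correlation step scales with the embedding dimension `m` rather than the ambient dimension `d`.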

Updated: 2020-03-09