Regularized least squares locality preserving projections with applications to image recognition.
Neural Networks (IF 6.0) Pub Date: 2020-05-21, DOI: 10.1016/j.neunet.2020.05.023
Wei Wei, Hua Dai, Weitai Liang

Locality preserving projection (LPP), a well-known technique for dimensionality reduction, is designed to preserve the local structure of the original samples, which in real-world applications usually lie on a low-dimensional manifold. However, LPP suffers from the undersampled, or small-sample-size, problem when the feature dimension exceeds the number of samples, because the corresponding generalized eigenvalue problem then becomes ill-posed. To address this problem, we show that LPP is equivalent to a multivariate linear regression under a mild condition, thereby establishing a connection between LPP and a least squares problem with multiple right-hand sides. Based on this connection, we propose two regularized least squares methods for solving LPP. Experimental results on real-world databases illustrate the performance of our methods.
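
The abstract does not spell out the formulation, but standard LPP leads to the generalized eigenvalue problem X L X^T a = lambda X D X^T a, where L = D - S is the graph Laplacian of an affinity matrix S; when the feature dimension exceeds the sample size, X D X^T is singular and the problem is ill-posed. The following is a minimal Python sketch of one common regularized least squares route of this kind (in the spirit of spectral-regression-style solvers): compute a low-dimensional embedding from the graph eigenproblem, then recover the projection by a ridge-regularized least squares fit whose right-hand side has multiple columns. The function name, hyperparameters, and this particular two-step construction are illustrative assumptions, not the paper's two proposed methods.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh


def lpp_via_regularized_lsq(X, n_components=2, n_neighbors=5, t=1.0, alpha=0.1):
    """Sketch: LPP-style projection via a graph eigenproblem plus a
    ridge-regularized least squares fit with multiple right-hand-side columns.

    X is (n_samples, n_features); all hyperparameters are illustrative
    defaults, not values taken from the paper.
    """
    n, d = X.shape

    # Heat-kernel affinity on a symmetrized k-nearest-neighbor graph.
    sqdist = cdist(X, X, "sqeuclidean")
    nbrs = np.argsort(sqdist, axis=1)[:, 1:n_neighbors + 1]  # skip self (column 0)
    S = np.zeros((n, n))
    rows = np.repeat(np.arange(n), n_neighbors)
    S[rows, nbrs.ravel()] = np.exp(-sqdist[rows, nbrs.ravel()] / t)
    S = np.maximum(S, S.T)

    D = np.diag(S.sum(axis=1))  # degree matrix
    L = D - S                   # graph Laplacian

    # Step 1: embedding from the graph eigenproblem L y = lambda D y,
    # skipping the trivial constant eigenvector.
    _, vecs = eigh(L, D)
    Y = vecs[:, 1:n_components + 1]

    # Step 2: ridge-regularized least squares fit X W ~= Y; the term alpha * I
    # keeps the solve well-posed even when n_features > n_samples.
    W = np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)
    return W


# Usage: W = lpp_via_regularized_lsq(X_train, n_components=10)
#        Z = X_train @ W  # low-dimensional representation of the samples
```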




Updated: 2020-05-21