Quickly Finding the Best Linear Model in High Dimensions via Projected Gradient Descent
IEEE Transactions on Signal Processing (IF 5.4) Pub Date: 2020-01-01, DOI: 10.1109/tsp.2020.2964216
Yahya Sattar, Samet Oymak

We study the problem of finding the best linear model that minimizes the least-squares loss over a given dataset. While this problem is trivial in the low-dimensional regime, it becomes more interesting in high dimensions, where the population minimizer is assumed to lie on a manifold such as the set of sparse vectors. We propose a projected gradient descent (PGD) algorithm to estimate the population minimizer in the finite-sample regime. We establish a linear convergence rate and data-dependent estimation error bounds for PGD. Our contributions include: 1) our results hold for heavier-tailed subexponential distributions in addition to subgaussian ones, and allow for an intercept term; 2) we directly analyze empirical risk minimization and do not require a realizable model connecting the input data and labels. Numerical experiments validate our theoretical results.
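To make the procedure concrete, here is a minimal sketch of projected gradient descent on the empirical least-squares loss, assuming the constraint set is the s-sparse vectors so that the projection step is hard thresholding. The function names (pgd_least_squares, hard_threshold), the step-size heuristic, and the synthetic data are illustrative assumptions, not the paper's exact construction (which also handles an intercept term and subexponential designs).

```python
import numpy as np

def hard_threshold(w, s):
    """Project onto s-sparse vectors: keep the s largest-magnitude entries."""
    out = np.zeros_like(w)
    idx = np.argsort(np.abs(w))[-s:]
    out[idx] = w[idx]
    return out

def pgd_least_squares(X, y, s, eta=None, iters=100):
    """Projected gradient descent on f(w) = (1/2n) * ||X w - y||^2,
    projected onto the set of s-sparse vectors after each step."""
    n, d = X.shape
    if eta is None:
        # Step-size heuristic (assumption): 1/L with L = sigma_max(X)^2 / n,
        # the Lipschitz constant of the gradient of f.
        eta = n / np.linalg.norm(X, 2) ** 2
    w = np.zeros(d)
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / n   # gradient of the empirical loss
        w = hard_threshold(w - eta * grad, s)
    return w

# Hypothetical usage: sparse ground truth with noisy labels.
rng = np.random.default_rng(0)
n, d, s = 200, 1000, 10
X = rng.standard_normal((n, d))
w_star = np.zeros(d)
w_star[:s] = rng.standard_normal(s)
y = X @ w_star + 0.1 * rng.standard_normal(n)
w_hat = pgd_least_squares(X, y, s)
print(np.linalg.norm(w_hat - w_star))  # estimation error of the PGD iterate
```

Note that the analysis in the paper does not require the labels to follow such a realizable model; the synthetic generative model above is used only to produce data for the demonstration.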
