Universality Laws for High-Dimensional Learning with Random Features
arXiv - CS - Information Theory Pub Date : 2020-09-16 , DOI: arxiv-2009.07669 Hong Hu and Yue M. Lu
We prove a universality theorem for learning with random features. Our result
shows that, in terms of training and generalization errors, the random feature
model with a nonlinear activation function is asymptotically equivalent to a
surrogate Gaussian model with a matching covariance matrix. This settles a
conjecture on which several recent papers have based their results. Our
method for proving universality builds on the classical Lindeberg approach.
Major ingredients of the proof include a leave-one-out analysis for the
optimization problem associated with the training process and a central limit
theorem, obtained via Stein's method, for weakly correlated random variables.
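The equivalence can be illustrated numerically. The following sketch (a minimal illustration, not the paper's method; the teacher model, ridge loss, sample sizes, and ReLU activation are all assumptions made here for concreteness) compares ridge regression on nonlinear random features σ(Wx) against the surrogate Gaussian features with matching first and second moments, μ₀ + μ₁·Wx + μ★·g, where μ₀ = E[σ(Z)], μ₁ = E[Z σ(Z)], and μ★² = E[σ(Z)²] − μ₀² − μ₁² for Z ~ N(0, 1):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, N = 400, 200, 300            # samples, input dimension, number of random features

# Data and random feature weights; X scaled so pre-activations W x have unit variance
X = rng.standard_normal((n, d)) / np.sqrt(d)
W = rng.standard_normal((N, d))
Z = X @ W.T                        # pre-activations, entries approximately N(0, 1)

relu = lambda t: np.maximum(t, 0.0)
Phi = relu(Z)                      # nonlinear random-feature matrix

# Gaussian surrogate with matching moments (closed-form constants for ReLU)
mu0 = 1.0 / np.sqrt(2.0 * np.pi)   # E[relu(Z)]
mu1 = 0.5                          # E[Z * relu(Z)]
mu_star = np.sqrt(0.5 - mu0**2 - mu1**2)   # residual standard deviation
Psi = mu0 + mu1 * Z + mu_star * rng.standard_normal((n, N))

# Labels from an assumed linear teacher with small noise
beta = rng.standard_normal(d)
y = X @ beta + 0.1 * rng.standard_normal(n)
lam = 1e-2                         # ridge regularization strength

def ridge_train_err(F, y, lam):
    """Training mean-squared error of ridge regression on feature matrix F."""
    a = np.linalg.solve(F.T @ F + lam * np.eye(F.shape[1]), F.T @ y)
    return np.mean((F @ a - y) ** 2)

err_rf = ridge_train_err(Phi, y, lam)
err_g = ridge_train_err(Psi, y, lam)
print(f"random-feature train error:    {err_rf:.4f}")
print(f"Gaussian surrogate train error: {err_g:.4f}")
```

At these moderate sizes the two training errors should already be of comparable magnitude; the theorem states that they coincide asymptotically as n, d, N grow proportionally.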
Updated: 2020-09-17