High-order approximation rates for shallow neural networks with cosine and ReLU^k activation functions
Applied and Computational Harmonic Analysis (IF 2.5), Pub Date: 2021-12-21, DOI: 10.1016/j.acha.2021.12.005
Jonathan W. Siegel, Jinchao Xu

We study the approximation properties of shallow neural networks with an activation function that is a power of the rectified linear unit. Specifically, we consider the dependence of the approximation rate on the dimension and on the smoothness, in the spectral Barron space, of the underlying function f to be approximated. We show that as the smoothness index s of f increases, shallow neural networks with ReLU^k activation function attain an improved approximation rate, up to a best possible rate of O(n^{-(k+1)} log(n)) in L^2, independent of the dimension d. The significance of this result is that the activation function ReLU^k is fixed independent of the dimension, while for classical methods the degree of polynomial approximation or the smoothness of the wavelets used would have to increase in order to take advantage of the dimension-dependent smoothness of f. In addition, we derive improved approximation rates for shallow neural networks with cosine activation function on the spectral Barron space. Finally, we prove lower bounds showing that the approximation rates attained are optimal under the given assumptions.
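For concreteness, the model class in question consists of shallow (single-hidden-layer) networks with ReLU^k activation; the notation below (outer coefficients a_j, inner weights ω_j, biases b_j) is a generic sketch of such a network and is not taken verbatim from the paper:

\[
f_n(x) \;=\; \sum_{j=1}^{n} a_j \, \sigma_k(\omega_j \cdot x + b_j),
\qquad
\sigma_k(t) \;=\; \mathrm{ReLU}(t)^k \;=\; \max(0, t)^k ,
\]

so the best possible rate stated above corresponds to an L^2 approximation error of order n^{-(k+1)} log(n) as the number of neurons n grows, provided f is sufficiently smooth in the spectral Barron scale.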




Updated: 2021-12-24