Approximation rates for neural networks with general activation functions.
Neural Networks (IF 6.0), Pub Date: 2020-05-18, DOI: 10.1016/j.neunet.2020.05.019
Jonathan W. Siegel, Jinchao Xu

We prove some new results concerning the approximation rates of neural networks with general activation functions. Our first result concerns the rate of approximation of a two-layer neural network with a polynomially decaying, non-sigmoidal activation function. We extend the dimension-independent approximation rates previously obtained to this new class of activation functions. Our second result gives a weaker, but still dimension-independent, approximation rate for a larger class of activation functions, removing the polynomial decay assumption; this result applies to any bounded, integrable activation function. Finally, we show that a stratified sampling approach can be used to improve the approximation rate for polynomially decaying activation functions under mild additional assumptions.
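For context, the dimension-independent rates in question are of Barron type. A minimal sketch of the standard form of such a bound, assuming the Barron-norm setting of Barron (1993) (the paper's precise norms, constants, and activation assumptions differ), is:

\[
  f_n(x) = \sum_{i=1}^{n} a_i \,\sigma(\omega_i \cdot x + b_i),
  \qquad
  \| f - f_n \|_{L^2(\Omega)} \le \frac{C\,\|f\|_{\mathcal{B}}}{\sqrt{n}},
\]

where \(\|f\|_{\mathcal{B}} = \int_{\mathbb{R}^d} |\omega|\,|\hat{f}(\omega)|\,d\omega\) is the Barron norm, \(\sigma\) is the activation function, and \(n\) is the number of neurons. The key point is that the exponent \(1/2\) does not depend on the input dimension \(d\); the paper's stratified sampling result improves this exponent for polynomially decaying activations under mild additional assumptions.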




Updated: 2020-05-18