On sharpness of error bounds for multivariate neural network approximation
Ricerche di Matematica (IF 1.2), Pub Date: 2020-12-23, DOI: 10.1007/s11587-020-00549-x
Steffen Goebbels

Single-hidden-layer feedforward neural networks can represent multivariate functions that are sums of ridge functions. These ridge functions are defined via an activation function and customizable weights. The paper deals with best non-linear approximation by such sums of ridge functions. Error bounds are presented in terms of moduli of smoothness. The main focus, however, is to prove that the bounds are sharp. To this end, counterexamples are constructed with a non-linear, quantitative extension of the uniform boundedness principle. They show sharpness with respect to Lipschitz classes for the logistic activation function and for certain piecewise polynomial activation functions. The paper is based on univariate results in Goebbels (Res Math 75(3):1–35, 2020. https://rdcu.be/b5mKH).
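To make the setting concrete: a sum of ridge functions as described above has the form f(x) = Σ_k c_k σ(w_k · x + b_k), where σ is the activation function (here the logistic function, one of the cases the paper treats), w_k are the inner weights, b_k the biases, and c_k the outer coefficients. A minimal sketch of evaluating such a network, assuming NumPy and hypothetical names (`shallow_net`, `logistic`) not taken from the paper:

```python
import numpy as np

def logistic(t):
    # Logistic (sigmoid) activation: sigma(t) = 1 / (1 + exp(-t))
    return 1.0 / (1.0 + np.exp(-t))

def shallow_net(x, weights, biases, coeffs):
    """Evaluate f(x) = sum_k c_k * sigma(w_k . x + b_k),
    a sum of ridge functions (single hidden layer, linear output).

    x: point in R^d, weights: (n, d), biases: (n,), coeffs: (n,)
    """
    return coeffs @ logistic(weights @ x + biases)

# Example: a network with 3 hidden neurons on a point in R^2
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))  # inner weights w_k
b = rng.standard_normal(3)       # biases b_k
c = rng.standard_normal(3)       # outer coefficients c_k
y = shallow_net(np.array([0.5, -0.25]), W, b, c)
```

Each hidden neuron contributes one ridge function, constant along directions orthogonal to w_k; the approximation results in the paper concern how well sums of n such terms can approximate smooth multivariate functions.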



