Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem.
Neural Networks ( IF 6.0 ) Pub Date : 2020-05-26 , DOI: 10.1016/j.neunet.2019.12.013
Hadrien Montanelli 1 , Haizhao Yang 2

We prove a theorem concerning the approximation of multivariate functions by deep ReLU networks, for which the curse of dimensionality is lessened. Our theorem is based on a constructive proof of the Kolmogorov–Arnold superposition theorem, and on a subset of multivariate continuous functions whose outer superposition functions can be efficiently approximated by deep ReLU networks.
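For context, the Kolmogorov–Arnold superposition theorem states that any continuous function on the unit cube can be written as a superposition of univariate functions; in its standard form:

```latex
% Kolmogorov–Arnold superposition theorem (standard statement):
% any continuous f on [0,1]^n admits the representation
f(x_1, \dots, x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right),
% where the inner functions \phi_{q,p} are continuous and independent of f,
% and the outer functions \Phi_q are continuous and depend on f.
```

The "outer superposition functions" referred to in the abstract are the $\Phi_q$; the paper's error bounds concern functions $f$ whose $\Phi_q$ are efficiently approximable by deep ReLU networks.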



