A counterexample regarding "New study on neural networks: The essential order of approximation".
Neural Networks ( IF 7.8 ) Pub Date : 2019-12-18 , DOI: 10.1016/j.neunet.2019.12.007
Steffen Goebbels 1

The paper "New study on neural networks: the essential order of approximation" by Jianjun Wang and Zongben Xu, which appeared in Neural Networks 23 (2010), deals with upper and lower estimates for the error of best approximation by sums of nearly exponential-type activation functions, expressed in terms of moduli of smoothness. In particular, the presented lower bound is astonishingly good. However, the proof is incorrect and the bound is wrong.
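For context, the error estimates are stated in terms of moduli of smoothness. A standard definition (the paper's exact notation may differ) of the r-th modulus of smoothness of a function $f \in L^p$ is

```latex
% Forward difference of order r with step h (standard definition, assumed notation):
\Delta_h^r f(x) = \sum_{j=0}^{r} (-1)^{r-j} \binom{r}{j} f(x + jh)

% The r-th modulus of smoothness of f in the L^p norm:
\omega_r(f, t)_p = \sup_{0 < h \le t} \left\| \Delta_h^r f \right\|_p
```

Upper (direct) estimates bound the best-approximation error from above by $\omega_r(f, t)_p$ for a suitable step $t$, while lower (inverse) estimates work in the opposite direction; the counterexample concerns a lower bound of this kind.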

Updated: 2019-12-18