Asymptotic Expansion for Neural Network Operators of the Kantorovich Type and High Order of Approximation
Mediterranean Journal of Mathematics (IF 1.1), Pub Date: 2021-02-23, DOI: 10.1007/s00009-021-01717-5
Marco Cantarini , Danilo Costarelli , Gianluca Vinti

In this paper, we study the rate of pointwise approximation for neural network operators of the Kantorovich type. This result is obtained by proving a certain asymptotic expansion for the above operators and then by establishing a Voronovskaja-type formula. A central role in the above results is played by the truncated algebraic moments of the density functions generated by suitable sigmoidal functions. Furthermore, to improve the rate of convergence, we consider finite linear combinations of the above neural network type operators, and in the latter case as well we obtain a Voronovskaja-type theorem. Finally, concrete examples of sigmoidal activation functions are discussed in depth, together with the case of the rectified linear unit (ReLU) activation function, which is widely used in connection with deep neural networks.
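A minimal numerical sketch of the operators discussed above, assuming the standard Kantorovich-type construction on an interval [a, b] with the density function generated by the logistic sigmoid, i.e. phi(x) = (sigma(x+1) - sigma(x-1))/2; the function and parameter names are illustrative, not taken from the paper:

```python
import math

def sigmoid(x):
    """Logistic sigmoid sigma(x) = 1 / (1 + e^{-x})."""
    return 1.0 / (1.0 + math.exp(-x))

def phi(x):
    """Density function generated by the logistic sigmoid (it sums to 1 over integer shifts)."""
    return 0.5 * (sigmoid(x + 1.0) - sigmoid(x - 1.0))

def kantorovich_nn(f, x, n, a=-1.0, b=1.0, quad=16):
    """Kantorovich-type neural network operator K_n(f, x):
    a weighted average of the mean values of f on the intervals [k/n, (k+1)/n],
    with weights phi(nx - k), normalized by the sum of the weights."""
    num = 0.0
    den = 0.0
    for k in range(math.ceil(n * a), math.floor(n * b)):
        w = phi(n * x - k)
        # mean value of f on [k/n, (k+1)/n], approximated by midpoint quadrature
        mean = sum(f((k + (j + 0.5) / quad) / n) for j in range(quad)) / quad
        num += w * mean
        den += w
    return num / den

# Example: approximate f(t) = t^2 at an interior point; the pointwise error
# behaves like O(1/n), consistent with a Voronovskaja-type expansion.
f = lambda t: t * t
approx = kantorovich_nn(f, 0.3, 200)
```

At x = 0.3 with n = 200 the value lands close to f(0.3) = 0.09, with a first-order correction of size roughly f'(x)/(2n), as the asymptotic expansion predicts.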




Updated: 2021-02-24