Construction of feedforward neural networks with simple architectures and approximation abilities
Mathematical Methods in the Applied Sciences ( IF 2.1 ) Pub Date : 2020-09-09 , DOI: 10.1002/mma.6876
Zhixiang Chen, Feilong Cao

The universal approximation property of feedforward neural networks (FNNs) is the basis for all FNN applications. Almost all existing studies of FNN approximation assume that the activation function of the network satisfies certain conditions, such as being sigmoidal or bounded and continuous. This paper focuses on building FNNs with simple architectures and approximation ability by constructing suitable activation functions. First, for any continuous function defined on [0, 1] and any accuracy ϵ > 0, an FNN is constructed that has only one neuron and a fixed weight of 1 and approximates the function to within ϵ. Thereafter, the representation of polynomial functions by FNNs is studied by constructing a proper activation function, and it is proved that any algebraic polynomial of degree n can be represented exactly by an FNN. Further, a piecewise function is constructed as the activation function of an FNN such that the FNN represents the classical Bernstein polynomials.
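As context for the last claim, the degree-n Bernstein polynomial of a continuous function f on [0, 1] can itself be read as a one-hidden-layer network: hidden unit k evaluates the basis function b_{k,n}(x) = C(n, k) x^k (1 − x)^{n−k}, and the output layer combines the units with fixed weights f(k/n). The following is a minimal numerical sketch of this standard construction; it is not the paper's piecewise-activation construction, only an illustration of the Bernstein operator it represents.

```python
from math import comb

def bernstein(f, n, x):
    """Degree-n Bernstein polynomial B_n(f; x) of f on [0, 1].

    Network view: hidden unit k computes the basis function
    b_{k,n}(x) = C(n, k) * x**k * (1 - x)**(n - k), and the output
    layer sums them with fixed weights f(k / n).
    """
    return sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

# Known identity: B_n(t^2; x) = x^2 + x(1 - x)/n, so at x = 0.5 with
# n = 50 the value is 0.25 + 0.25/50 = 0.255 (error 0.005).
approx = bernstein(lambda t: t * t, 50, 0.5)
```

Since B_n(f; x) → f(x) uniformly on [0, 1] as n → ∞, an activation function that reproduces the basis functions b_{k,n} immediately yields an FNN with the uniform approximation property, which is the role the constructed piecewise activation plays in the paper.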
