Approximation Spaces of Deep Neural Networks
Constructive Approximation (IF 2.7) | Pub Date: 2021-05-05 | DOI: 10.1007/s00365-021-09543-4
Rémi Gribonval, Gitta Kutyniok, Morten Nielsen, Felix Voigtlaender

We study the expressivity of deep neural networks. Measuring a network’s complexity by its number of connections or by its number of neurons, we consider the class of functions for which the error of best approximation with networks of a given complexity decays at a certain rate as the complexity budget increases. Using results from classical approximation theory, we show that this class can be endowed with a (quasi-)norm that makes it a linear function space, called an approximation space. We establish that allowing the networks to have certain types of “skip connections” does not change the resulting approximation spaces. We also discuss the influence of the network’s nonlinearity (also known as the activation function) on the resulting spaces, as well as the role of depth. For the popular ReLU nonlinearity and its powers, we relate the newly constructed spaces to classical Besov spaces. The established embeddings highlight that some functions of very low Besov smoothness can nevertheless be well approximated by neural networks, provided these networks are sufficiently deep.
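For readers who want the construction spelled out, the following is a minimal sketch of the classical approximation-space definition the abstract refers to, in standard DeVore–Lorentz style; the symbols below are assumed notation consistent with the abstract, not quoted from the paper. Write \(\Sigma_n\) for the set of functions realized by networks of complexity at most \(n\) (measured in connections or in neurons), and let \(E_n(f) := \inf_{g \in \Sigma_n} \|f - g\|_X\) be the error of best approximation in a quasi-Banach space \(X\). For a decay rate \(\alpha > 0\) and fine index \(0 < q \le \infty\), the approximation space is

\[
A^{\alpha}_{q}(X) := \bigl\{ f \in X : \|f\|_{A^{\alpha}_{q}(X)} < \infty \bigr\},
\qquad
\|f\|_{A^{\alpha}_{q}(X)} := \Bigl( \sum_{n=1}^{\infty} \bigl[ n^{\alpha} \, E_{n-1}(f) \bigr]^{q} \, \frac{1}{n} \Bigr)^{1/q},
\]

with the usual modification \(\|f\|_{A^{\alpha}_{\infty}(X)} := \sup_{n \ge 1} n^{\alpha} E_{n-1}(f)\) for \(q = \infty\). Membership in \(A^{\alpha}_{q}(X)\) thus encodes that the best-approximation error decays roughly like \(n^{-\alpha}\) as the complexity budget \(n\) grows. The “powers” of the ReLU mentioned above are the activations \(\varrho_r(x) := (\max\{0, x\})^{r}\) for integer \(r \ge 1\).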



Updated: 2021-05-06