A Theoretical Analysis of Deep Neural Networks and Parametric PDEs
Constructive Approximation (IF 2.3), Pub Date: 2021-06-02, DOI: 10.1007/s00365-021-09551-4
Gitta Kutyniok, Philipp Petersen, Mones Raslan, Reinhold Schneider

We derive upper bounds on the complexity of ReLU neural networks approximating the solution maps of parametric partial differential equations. In particular, without any knowledge of the solution manifold's concrete shape, we exploit its inherent low dimensionality to obtain approximation rates that are significantly superior to those provided by classical neural network approximation results. Concretely, we use the existence of a small reduced basis to construct, for a large variety of parametric partial differential equations, neural networks that approximate the parametric solution maps in such a way that the sizes of these networks essentially depend only on the size of the reduced basis.
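The construction described above can be illustrated schematically: a small ReLU network maps a parameter to coefficients in a reduced basis, and the full-grid approximation is recovered by expanding in that basis, so the network size scales with the (small) basis dimension rather than the discretization size. The sketch below is not the authors' construction, only a minimal illustration of this architecture; all names, sizes, and the random weights are hypothetical.

```python
import numpy as np

def relu(x):
    # ReLU activation used throughout the network.
    return np.maximum(x, 0.0)

# Hypothetical sizes: parameter dimension p, reduced-basis size m,
# full discretization size N (m << N).
p, m, N = 2, 4, 100
rng = np.random.default_rng(0)

# Reduced basis V (N x m) with orthonormal columns, standing in for a basis
# computed from snapshots of the solution manifold.
V, _ = np.linalg.qr(rng.standard_normal((N, m)))

# A small ReLU network mapping a parameter mu in R^p to reduced coefficients
# in R^m; its size depends on m and p, not on N. Weights here are random
# placeholders, not trained values.
W1 = rng.standard_normal((16, p)); b1 = rng.standard_normal(16)
W2 = rng.standard_normal((m, 16)); b2 = rng.standard_normal(m)

def coeff_net(mu):
    # One hidden ReLU layer predicting the m reduced-basis coefficients.
    return W2 @ relu(W1 @ mu + b1) + b2

def surrogate(mu):
    # Approximate solution map: expand the predicted coefficients
    # in the reduced basis to get a full-grid vector in R^N.
    return V @ coeff_net(mu)

u = surrogate(np.array([0.5, -0.3]))
print(u.shape)  # (100,)
```

By construction, every output of the surrogate lies in the m-dimensional span of the reduced basis, which is the structural reason the network's complexity can be decoupled from the full discretization size.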




Updated: 2021-06-02