Numerical Solution of the Parametric Diffusion Equation by Deep Neural Networks
Journal of Scientific Computing (IF 2.8), Pub Date: 2021-06-05, DOI: 10.1007/s10915-021-01532-w
Moritz Geist , Philipp Petersen , Mones Raslan , Reinhold Schneider , Gitta Kutyniok

We perform a comprehensive numerical study of the effect of approximation-theoretical results for neural networks on practical learning problems in the context of numerical analysis. As the underlying model, we study the machine-learning-based solution of parametric partial differential equations. Here, approximation theory for fully connected neural networks predicts that the performance of the model should depend only very mildly on the dimension of the parameter space and should instead be determined by the intrinsic dimension of the solution manifold of the parametric partial differential equation. We use various methods to establish comparability between test cases by minimizing the effect of the choice of test case on the optimization and sampling aspects of the learning problem. We find strong support for the hypothesis that approximation-theoretical effects heavily influence the practical behavior of learning problems in numerical analysis. Finally, turning to more modern and practically successful architectures, we derive improved error bounds for convolutional neural networks.
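To make the learning setup concrete, below is a minimal sketch of the data-generation step for such a problem: sample parameters, solve the parametric diffusion equation by finite differences, and collect (parameter, solution) pairs that a network would regress. The affine form of the diffusion coefficient `a(x; mu)` and all sizes here are illustrative assumptions, not the paper's actual test cases.

```python
import numpy as np

def solve_diffusion(mu, n=128):
    """Solve -(a(x; mu) u')' = 1 on (0, 1) with u(0) = u(1) = 0 by finite
    differences. The coefficient a(x; mu) = 1 + sum_k mu_k sin(k*pi*x)/k is a
    hypothetical affine parametrization (kept positive for small |mu_k|).
    Returns the solution at the n-1 interior grid points."""
    h = 1.0 / n
    x_half = (np.arange(n) + 0.5) * h            # midpoints x_{i+1/2}
    k = np.arange(1, len(mu) + 1)
    a_half = 1.0 + np.sin(np.outer(x_half, k) * np.pi) @ (np.asarray(mu) / k)
    # Tridiagonal stiffness matrix from the flux form of the 3-point scheme.
    main = (a_half[:-1] + a_half[1:]) / h**2     # diagonal, nodes 1..n-1
    off = -a_half[1:-1] / h**2                   # off-diagonals
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    f = np.ones(n - 1)                           # constant right-hand side
    return np.linalg.solve(A, f)

# Parameter samples -> high-fidelity solutions: the training set for the
# parameter-to-solution map that a neural network would approximate.
rng = np.random.default_rng(0)
p = 3                                            # parameter dimension
mus = rng.uniform(-0.3, 0.3, size=(100, p))      # 100 training parameters
solutions = np.stack([solve_diffusion(mu) for mu in mus])
```

The intrinsic-dimension hypothesis in the abstract concerns how the network's error scales as `p` grows; in this sketch one would vary `p` while holding the sampling and optimization setup fixed.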




Updated: 2021-06-05