ReLU deep neural networks from the hierarchical basis perspective
Computers & Mathematics with Applications ( IF 2.9 ) Pub Date : 2022-07-08 , DOI: 10.1016/j.camwa.2022.06.006
Juncai He , Lin Li , Jinchao Xu

We study ReLU deep neural networks (DNNs) by investigating their connections with the hierarchical basis method from finite element methods. First, we show that the ReLU DNN approximation schemes for x² and xy are composed versions of the hierarchical basis approximations of these two functions. Based on this fact, we obtain a geometric interpretation and a systematic proof of the polynomial approximation result for ReLU DNNs, which plays an important role in a series of recent exponential approximation results for ReLU DNNs. Through our investigation of the connections between ReLU DNNs and the hierarchical basis approximations of x² and xy, we show that ReLU DNNs with this special structure can be applied only to approximate quadratic functions. Furthermore, we obtain a concise representation that explicitly reproduces any linear finite element function on a two-dimensional uniform mesh using ReLU DNNs with only two hidden layers.
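The composed hierarchical-basis construction for x² referred to above can be sketched numerically. The sketch below follows the standard "sawtooth" construction (as in Yarotsky-style approximation results): a piecewise-linear hat function h, expressible with three ReLUs, is composed with itself k times to produce the level-k hierarchical basis correction, and the partial sums f_m(x) = x − Σ_{k=1}^{m} h∘…∘h(x)/4^k converge to x² on [0, 1] with uniform error 4^{−(m+1)}. The function names here are illustrative, not from the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Piecewise-linear hat on [0, 1]: 0 at x=0 and x=1, peak value 1 at x=0.5.
    # Realized exactly by a single ReLU layer with three units.
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def approx_square(x, m):
    # f_m(x) = x - sum_{k=1..m} h^{(k)}(x) / 4^k, where h^{(k)} is the
    # k-fold composition of the hat function. Each composition doubles the
    # number of hats, matching the level-k hierarchical basis correction.
    g = np.asarray(x, dtype=float).copy()
    out = g.copy()
    for k in range(1, m + 1):
        g = hat(g)
        out = out - g / 4.0**k
    return out

# Uniform error on [0, 1] should be bounded by 4^-(m+1); for m = 6 that is
# about 6.1e-5.
x = np.linspace(0.0, 1.0, 1001)
err = np.max(np.abs(approx_square(x, 6) - x**2))
```

Note that a depth-m network of constant width suffices here, which is the source of the exponential (in depth) accuracy mentioned in the abstract; the approximation of xy then follows from the polarization identity xy = ((x+y)² − x² − y²)/2.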



