Exponential ReLU DNN Expression of Holomorphic Maps in High Dimension
Constructive Approximation (IF 2.3), Pub Date: 2021-04-23, DOI: 10.1007/s00365-021-09542-5
J. A. A. Opschoor, Ch. Schwab, J. Zech

For a parameter dimension \(d\in {\mathbb {N}}\), we consider the approximation of many-parametric maps \(u: [-\,1,1]^d\rightarrow {\mathbb R}\) by deep ReLU neural networks. The input dimension d may possibly be large, and we assume quantitative control of the domain of holomorphy of u: i.e., u admits a holomorphic extension to a Bernstein polyellipse \({{\mathcal {E}}}_{\rho _1}\times \cdots \times {{\mathcal {E}}}_{\rho _d} \subset {\mathbb {C}}^d\) of semiaxis sums \(\rho _i>1\) containing \([-\,1,1]^{d}\). We establish the exponential rate \(O(\exp (-\,bN^{1/(d+1)}))\) of expressive power in terms of the total NN size N and of the input dimension d of the ReLU NN in \(W^{1,\infty }([-\,1,1]^d)\). The constant \(b>0\) depends on \((\rho _j)_{j=1}^d\) which characterizes the coordinate-wise sizes of the Bernstein-ellipses for u. We also prove exponential convergence in stronger norms for the approximation by DNNs with more regular, so-called “rectified power unit” activations. Finally, we extend DNN expression rate bounds also to two classes of non-holomorphic functions, in particular to d-variate, Gevrey-regular functions, and, by composition, to certain multivariate probability distribution functions with Lipschitz marginals.
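Stated schematically (with constants \(C, b > 0\) depending on u, on d, and on the polyradii \((\rho_j)_{j=1}^d\); the precise theorem, including the exact notion of NN size, is given in the paper), the main expression-rate bound is of the form

\[
\inf_{\substack{\tilde{u}\ \text{ReLU NN},\\ \mathrm{size}(\tilde{u}) \le N}}
\Vert u - \tilde{u} \Vert_{W^{1,\infty}([-1,1]^d)}
\;\le\; C \exp\bigl(-b\,N^{1/(d+1)}\bigr).
\]

Inverting this bound, an accuracy \(\varepsilon\) in \(W^{1,\infty}([-1,1]^d)\) is reached by a ReLU network of size \(N = O\bigl((\log(1/\varepsilon)/b)^{d+1}\bigr)\), i.e., the required network size grows only polylogarithmically in \(1/\varepsilon\) for fixed input dimension d.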




Updated: 2021-04-23