Neural Computation (IF 2.7), Pub Date: 2021-01-29, DOI: 10.1162/neco_a_01364
Zuowei Shen, Haizhao Yang, Shijun Zhang
A new network with super-approximation power is introduced. This network is built with Floor ($\lfloor x\rfloor$) or ReLU ($\max\{0,x\}$) activation function in each neuron; hence, we call such networks Floor-ReLU networks. For any hyperparameters $N\in\mathbb{N}^+$ and $L\in\mathbb{N}^+$, we show that Floor-ReLU networks with width $\max\{d,\,5N+13\}$ and depth $64dL+3$ can uniformly approximate a Hölder function $f$ on $[0,1]^d$ with an approximation error $3\lambda d^{\alpha/2}N^{-\alpha\sqrt{L}}$, where $\alpha\in(0,1]$ and $\lambda$ are the Hölder order and constant, respectively. More generally, for an arbitrary continuous function $f$ on $[0,1]^d$ with a modulus of continuity $\omega_f(\cdot)$, the constructive approximation rate is $\omega_f(\sqrt{d}\,N^{-\sqrt{L}})+2\omega_f(\sqrt{d})\,N^{-\sqrt{L}}$. As a consequence, this new class of networks overcomes the curse of dimensionality in approximation power when the variation of $\omega_f(r)$ as $r\to 0$ is moderate (e.g., $\omega_f(r)\lesssim r^{\alpha}$ for Hölder continuous functions), since the major term to be considered in our approximation rate is essentially $\sqrt{d}$ times a function of $N$ and $L$ independent of $d$ within the modulus of continuity.
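As a small numerical sketch (not from the authors' code; the function names below are hypothetical), the stated network size and the Hölder-case error bound $3\lambda d^{\alpha/2}N^{-\alpha\sqrt{L}}$ can be evaluated directly, which also makes visible how the error decays in the depth parameter $L$ for a fixed width parameter $N$:

```python
import math

def holder_error_bound(N, L, d, alpha=1.0, lam=1.0):
    """Uniform approximation error bound 3*lam*d**(alpha/2)*N**(-alpha*sqrt(L))
    for a Hölder function of order alpha and constant lam on [0,1]^d."""
    return 3.0 * lam * d ** (alpha / 2.0) * N ** (-alpha * math.sqrt(L))

def floor_relu_network_size(N, L, d):
    """Width and depth of the Floor-ReLU network achieving the bound:
    width max{d, 5N+13}, depth 64dL+3."""
    return max(d, 5 * N + 13), 64 * d * L + 3

# With N=2, L=4, d=1, alpha=lam=1: bound = 3 * 2**(-2) = 0.75.
print(holder_error_bound(2, 4, 1))        # → 0.75
print(floor_relu_network_size(2, 4, 1))   # width 23, depth 259
```

Note that the bound is root-exponential in depth: increasing $L$ from 4 to 9 drops the error factor from $N^{-2}$ to $N^{-3}$, while the depth grows only linearly.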
Title (translated from the Chinese): Deep Network with Approximation Error Being Reciprocal of Width to Power of Square Root of Depth