On the expected behaviour of noise regularised deep neural networks as Gaussian processes
Pattern Recognition Letters (IF 5.1), Pub Date: 2020-06-30, DOI: 10.1016/j.patrec.2020.06.027
Arnu Pretorius, Herman Kamper, Steve Kroon

Recent work has established the equivalence between deep neural networks and Gaussian processes (GPs), resulting in so-called neural network Gaussian processes (NNGPs). The behaviour of these models depends on the initialisation of the corresponding network. In this work, we consider the impact of noise regularisation (e.g. dropout) on NNGPs, and relate their behaviour to signal propagation theory in noise regularised deep neural networks. For ReLU activations, we find that the best performing NNGPs have kernel parameters that correspond to a recently proposed initialisation scheme for noise regularised ReLU networks. In addition, we show how the noise influences the covariance matrix of the NNGP, producing a stronger prior towards simple functions away from the training points. We verify our theoretical findings with experiments on MNIST and CIFAR-10 as well as on synthetic data.
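
The kernel recursion behind these results can be sketched concretely. Below is a minimal NumPy illustration of a noisy ReLU NNGP covariance, assuming the standard arc-cosine closed form for the ReLU expectation (Cho & Saul, 2009) and multiplicative noise (such as dropout) drawn independently on each forward pass, so that only the variance (diagonal) terms are inflated by the noise's second moment mu2 = E[eps^2]. The function names and the exact placement of the noise are illustrative assumptions, not the authors' code.

```python
import numpy as np

def relu_gauss_expectation(kaa, kab, kbb):
    """E[relu(u) * relu(v)] for (u, v) ~ N(0, [[kaa, kab], [kab, kbb]]),
    via the arc-cosine kernel closed form (assumes kaa, kbb > 0)."""
    norm = np.sqrt(kaa * kbb)
    cos_t = np.clip(kab / norm, -1.0, 1.0)
    theta = np.arccos(cos_t)
    return norm * (np.sin(theta) + (np.pi - theta) * cos_t) / (2.0 * np.pi)

def noisy_relu_nngp_kernel(X, depth, sigma_w2, sigma_b2, mu2):
    """Hypothetical NNGP kernel for a ReLU network with `depth` hidden
    layers and multiplicative noise of second moment mu2 (for dropout with
    keep probability p and 1/p rescaling, mu2 = 1/p). Noise is injected at
    the hidden layers only here; corrupting the inputs as well would scale
    the input-layer diagonal by mu2 too."""
    n, d = X.shape
    K = sigma_b2 + sigma_w2 * (X @ X.T) / d  # input-layer kernel
    for _ in range(depth):
        diag = np.diag(K)
        E = np.empty_like(K)
        for i in range(n):
            for j in range(n):
                E[i, j] = relu_gauss_expectation(diag[i], K[i, j], diag[j])
        K = sigma_b2 + sigma_w2 * E
        # Independent noise across forward passes: E[eps_i eps_j] = 1 off
        # the diagonal but E[eps^2] = mu2 on it, so only variances inflate.
        K[np.diag_indices(n)] = sigma_b2 + sigma_w2 * mu2 * np.diag(E)
    return K
```

With sigma_b2 = 0 and a critical choice of the form sigma_w2 = 2/mu2 (the noisy ReLU initialisation suggested by the authors' earlier signal-propagation work, stated here as an assumption), this recursion preserves the diagonal with depth while pulling correlations between distinct inputs towards a fixed point strictly below one, which is one way to read the stronger prior towards simple functions away from the training points.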

Updated: 2020-07-13