Hyperparameters of Multilayer Perceptron with Normal Distributed Weights
Pattern Recognition and Image Analysis, Pub Date: 2020-06-19, DOI: 10.1134/s1054661820020054
Y. Karaki, N. Ivanov

Abstract

Multilayer perceptrons, recurrent neural networks, convolutional networks, and other types of neural networks are widespread nowadays. Neural networks have hyperparameters such as the number of hidden layers, the number of units in each hidden layer, the learning rate, and the activation function. Bayesian optimization is one of the methods used for tuning hyperparameters; this technique usually treats the values of the neurons in the network as stochastic Gaussian processes. This article reports experimental results of a multivariate normality test and shows that the neuron vectors deviate considerably from a Gaussian distribution.
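
The article itself provides no code, but the following Python sketch illustrates the kind of check the abstract describes: a toy multilayer perceptron is initialized with normally distributed weights, its hidden-layer activation vectors are collected, and Mardia's multivariate skewness/kurtosis test is applied to them. The layer sizes, the sigmoid activation, and the choice of Mardia's test are illustrative assumptions, not details taken from the article.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy MLP with normally distributed weights (one hidden layer, sigmoid units).
# All sizes are arbitrary choices for illustration.
n_samples, n_inputs, n_hidden = 500, 10, 5
X = rng.normal(size=(n_samples, n_inputs))
W1 = rng.normal(size=(n_inputs, n_hidden))
b1 = rng.normal(size=n_hidden)
H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))   # hidden activation vectors, shape (n, p)

def mardia_test(Z):
    """Mardia's multivariate skewness and kurtosis tests of normality."""
    n, p = Z.shape
    Zc = Z - Z.mean(axis=0)
    S = (Zc.T @ Zc) / n                     # MLE covariance estimate
    D = Zc @ np.linalg.inv(S) @ Zc.T        # Mahalanobis cross-products
    b1p = (D ** 3).sum() / n**2             # multivariate skewness
    b2p = (np.diag(D) ** 2).sum() / n       # multivariate kurtosis
    skew_stat = n * b1p / 6.0
    skew_df = p * (p + 1) * (p + 2) / 6.0
    p_skew = stats.chi2.sf(skew_stat, skew_df)
    kurt_stat = (b2p - p * (p + 2)) / np.sqrt(8.0 * p * (p + 2) / n)
    p_kurt = 2.0 * stats.norm.sf(abs(kurt_stat))
    return p_skew, p_kurt

p_skew, p_kurt = mardia_test(H)
print(f"Mardia skewness p-value: {p_skew:.3g}")
print(f"Mardia kurtosis p-value: {p_kurt:.3g}")
# Small p-values indicate that the activation vectors deviate from multivariate
# normality, which is the kind of departure the article reports.

The same test could be applied to activations recorded from any trained network; Mardia's test is used here only because it is easy to implement from scratch, whereas the article does not specify which multivariate normality test was employed.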

