Hyperparameter-free Regularization by Sampling from an Infinite Space of Neural Networks
International Journal on Artificial Intelligence Tools (IF 1.1), Pub Date: 2021-03-26, DOI: 10.1142/s0218213021500081
Thomas M. Whitehead

Stochastic activation functions, in which the output of an activation function is a hyperparameter-free random function of its inputs, generalize the concept of dropout to sampling from an infinitely large space of related networks. Stochastic activation functions provide intrinsic regularization and sparsification of artificial neural networks, along with cheap and accurate estimates of the uncertainty in a network's predictions. Examples are presented on standard benchmark datasets.
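
The abstract does not specify the paper's particular activation function, so the following is a minimal PyTorch sketch of the general idea under stated assumptions: a ReLU-like unit whose positive-part slope is redrawn from a fixed U(0, 1) distribution on every forward pass, so there is no tunable rate analogous to dropout's keep probability, and repeated stochastic forward passes are aggregated for an uncertainty estimate in the spirit of Monte Carlo dropout. The StochasticActivation module and its 0.5 mean-slope test-time approximation are illustrative assumptions, not the paper's method.

import torch
import torch.nn as nn

class StochasticActivation(nn.Module):
    """Illustrative stochastic activation (hypothetical, not the paper's
    exact function): on each forward pass the positive-part slope of each
    unit is drawn from U(0, 1), so every pass samples a different member
    of an infinite family of related networks, with no tunable rate."""
    def forward(self, x):
        if self.training:
            slope = torch.rand_like(x)  # fresh random slope every pass
            return torch.where(x > 0, slope * x, torch.zeros_like(x))
        # Deterministic point estimate: use the mean slope E[U(0,1)] = 0.5.
        return 0.5 * torch.relu(x)

net = nn.Sequential(nn.Linear(10, 64), StochasticActivation(),
                    nn.Linear(64, 1))

# Monte Carlo uncertainty estimate: keep the stochastic layers active at
# prediction time and aggregate repeated samples from the network family.
net.train()
x = torch.randn(32, 10)
with torch.no_grad():
    samples = torch.stack([net(x) for _ in range(100)])
mean, std = samples.mean(dim=0), samples.std(dim=0)

Because the slope distribution is fixed rather than learned or tuned, sampling adds no parameters or hyperparameters; the spread of the Monte Carlo samples then serves as the cheap predictive-uncertainty estimate the abstract describes.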
