Multi-Activation Hidden Units for Neural Networks with Random Weights
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2020-09-06, DOI: arxiv-2009.08932
Ajay M. Patrikar

Single-layer feedforward networks with random weights are successful in a variety of classification and regression problems. These networks are known for their non-iterative, fast training algorithms. A major drawback of these networks is that they require a large number of hidden units. In this paper, we propose the use of multi-activation hidden units. Such units increase the number of tunable parameters and enable the formation of complex decision surfaces without increasing the number of hidden units. We show experimentally that multi-activation hidden units can be used either to improve classification accuracy or to reduce computation.
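To make the idea concrete, below is a minimal sketch of a random-weight single-layer network (ELM-style) in which each hidden unit applies several activation functions to the same random pre-activation, so the number of tunable output weights grows without adding hidden units. This is an illustrative assumption about the construction, not the paper's exact formulation; the function names and the particular choice of activations (tanh, sigmoid, ReLU) are hypothetical.

```python
# Sketch: multi-activation hidden units in a random-weight network.
# Assumptions (not from the paper): three activations per unit and a
# least-squares fit of the output weights, as in standard ELM training.
import numpy as np

def hidden_features(X, W, b):
    Z = X @ W + b  # one shared random pre-activation per hidden unit
    # Each unit emits several activations of the same pre-activation,
    # so tunable output parameters triple without adding hidden units.
    return np.hstack([np.tanh(Z),
                      1.0 / (1.0 + np.exp(-Z)),
                      np.maximum(Z, 0.0)])

def fit_multi_activation_net(X, Y, n_hidden=50, seed=None):
    rng = np.random.default_rng(seed)
    # Random input weights and biases, fixed and never trained.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = hidden_features(X, W, b)
    # Non-iterative training: solve for output weights by least squares.
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)
    return W, b, beta

def predict(X, W, b, beta):
    return hidden_features(X, W, b) @ beta

# Usage: fit one-hot targets for classification, then take the argmax.
X = np.random.default_rng(0).standard_normal((200, 8))
Y = np.eye(3)[np.random.default_rng(1).integers(0, 3, 200)]
W, b, beta = fit_multi_activation_net(X, Y, n_hidden=50, seed=2)
labels = predict(X, W, b, beta).argmax(axis=1)
```

In this sketch the output layer sees three feature columns per hidden unit rather than one, which is how the tunable parameter count increases while the random hidden layer stays the same size.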

Updated: 2020-09-25