Memory capacity of neural networks with threshold and ReLU activations
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2020-01-20, DOI: arxiv-2001.06938
Roman Vershynin

Overwhelming theoretical and empirical evidence shows that mildly overparametrized neural networks -- those with more connections than the size of the training data -- are often able to memorize the training data with $100\%$ accuracy. This was rigorously proved for networks with sigmoid activation functions and, very recently, for ReLU activations. Addressing a 1988 open question of Baum, we prove that this phenomenon holds for general multilayered perceptrons, i.e. neural networks with threshold activation functions, or with any mix of threshold and ReLU activations. Our construction is probabilistic and exploits sparsity.
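As an informal illustration of the memorization phenomenon (this is not the paper's probabilistic, sparsity-exploiting construction, and all names and sizes below are arbitrary), the following NumPy sketch builds a two-layer ReLU network whose hidden layer is drawn at random and whose output weights are then solved for exactly. When the hidden width m is at least the number of samples n, the random feature matrix is generically full rank, so the network interpolates arbitrary labels and reaches 100% training accuracy.

import numpy as np

rng = np.random.default_rng(0)

# n samples in dimension d, hidden width m >= n (mildly overparametrized)
n, d, m = 50, 10, 60

X = rng.standard_normal((n, d))            # training inputs
y = rng.choice([-1.0, 1.0], size=n)        # arbitrary +/-1 labels to memorize

W = rng.standard_normal((d, m))            # random hidden weights (fixed, not trained)
b = rng.standard_normal(m)                 # random hidden biases
H = np.maximum(X @ W + b, 0.0)             # ReLU features, shape (n, m)

# Solve H v = y for the output layer; with m >= n this is exactly solvable
# for generic random W, b, so the network fits every label.
v, *_ = np.linalg.lstsq(H, y, rcond=None)
pred = np.sign(H @ v)

print("training accuracy:", np.mean(pred == y))   # prints 1.0

The same experiment with threshold (sign) activations cannot be set up by gradient descent, which is one reason the threshold case addressed in the paper requires an explicit construction rather than a training argument.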

Updated: 2020-06-04