Sparsity through evolutionary pruning prevents neuronal networks from overfitting.
Neural Networks (IF 6.0) Pub Date: 2020-05-11, DOI: 10.1016/j.neunet.2020.05.007
Richard C. Gerum, André Erpenbeck, Patrick Krauss, Achim Schilling

Modern machine learning techniques take advantage of the exponentially rising computational power of new-generation processing units, and the number of parameters trained to solve complex tasks has therefore grown enormously over the last decades. However, in contrast to our brain, these networks still fail to develop general intelligence in the sense of solving several complex tasks with a single network architecture. One possible reason is that the brain is not a randomly initialized neural network that has to be trained from scratch by simply investing large amounts of computational power, but instead possesses a fixed hierarchical structure from birth. To make progress in decoding the structural basis of biological neural networks, we chose a bottom-up approach in which we evolutionarily trained small neural networks to perform a maze task. This simple maze task requires dynamic decision making with delayed rewards. We show that random severance of connections during evolutionary optimization leads to better generalization performance than that of fully connected networks. We conclude that sparsity is a central property of neural networks and should be considered in modern machine learning approaches.
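The page carries no code, but the core idea lends itself to a compact illustration. Below is a minimal sketch, assuming a plain neuroevolution loop with truncation selection; the toy regression task, network sizes, mutation strength, and prune rate are illustrative stand-ins (the paper uses a maze task with delayed rewards, not reproduced here). The element matching the abstract is that mutation both perturbs weights and randomly, irreversibly severs connections, so sparsity accumulates over generations.

```python
import numpy as np

rng = np.random.default_rng(0)

class SparseNet:
    """Tiny feedforward net whose connections can be severed via binary masks."""

    def __init__(self, n_in, n_hidden, n_out):
        self.w1 = rng.normal(0.0, 1.0, (n_in, n_hidden))
        self.w2 = rng.normal(0.0, 1.0, (n_hidden, n_out))
        # Start fully connected; evolution may permanently zero entries out.
        self.m1 = np.ones_like(self.w1)
        self.m2 = np.ones_like(self.w2)

    def forward(self, x):
        h = np.tanh(x @ (self.w1 * self.m1))
        return np.tanh(h @ (self.w2 * self.m2))

    def mutated(self, weight_sigma=0.1, prune_rate=0.02):
        """Offspring: Gaussian weight mutation plus random severance of connections."""
        child = SparseNet.__new__(SparseNet)
        child.w1 = self.w1 + rng.normal(0.0, weight_sigma, self.w1.shape)
        child.w2 = self.w2 + rng.normal(0.0, weight_sigma, self.w2.shape)
        # Each surviving connection is cut with probability prune_rate;
        # a severed connection (mask entry 0) never recovers.
        child.m1 = self.m1 * (rng.random(self.m1.shape) > prune_rate)
        child.m2 = self.m2 * (rng.random(self.m2.shape) > prune_rate)
        return child

def fitness(net, x, y):
    """Negative mean squared error on a toy task (stand-in for the maze reward)."""
    return -np.mean((net.forward(x) - y) ** 2)

# Toy regression data standing in for the maze environment (purely illustrative).
x = rng.normal(size=(64, 4))
y = np.sin(x[:, :1])

population = [SparseNet(4, 8, 1) for _ in range(20)]
for generation in range(200):
    ranked = sorted(population, key=lambda n: fitness(n, x, y), reverse=True)
    parents = ranked[:5]  # truncation selection: keep the 5 fittest
    population = parents + [p.mutated() for p in parents for _ in range(3)]

best = max(population, key=lambda n: fitness(n, x, y))
kept = int(best.m1.sum() + best.m2.sum())
total = best.m1.size + best.m2.size
print(f"best fitness: {fitness(best, x, y):.4f}, connections kept: {kept}/{total}")
```

Because pruning here is one-way, the fraction of kept connections can only decrease, and selection retains only those lineages whose remaining wiring still solves the task, which is the mechanism the abstract credits with better generalization.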


