Improving Neural Network with Uniform Sparse Connectivity
arXiv - CS - Machine Learning. Pub Date: 2020-11-29, DOI: arxiv-2011.14420
Weijun Luo

Neural networks form the foundation of deep learning and numerous AI applications. Classical neural networks are fully connected, expensive to train and prone to overfitting. Sparse networks tend to involve convoluted structure search, suboptimal performance and limited usage. We propose the novel uniform sparse network (USN), which has even and sparse connectivity within each layer. USN has the striking property that its performance is independent of substantial topology variation across an enormous model space, and thus offers a search-free solution to all of the above issues. USN consistently and substantially outperforms state-of-the-art sparse network models in prediction accuracy, speed and robustness. It even achieves higher prediction accuracy than the fully connected network while using only 0.55% of the parameters and 1/4 of the computing time and resources. Importantly, USN is conceptually simple: it is a natural generalization of the fully connected network, with improvements in accuracy, robustness and scalability. USN can replace the latter in a range of applications, data types and deep learning architectures. We have made USN open source at https://github.com/datapplab/sparsenet.
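The abstract describes USN layers as having even, sparse connectivity that is fixed rather than searched. As a rough illustration only (not the code from the linked repository, and under the assumption that "even" connectivity means each output unit keeps the same number of randomly chosen incoming connections), a masked PyTorch linear layer might look like the sketch below; the class name, the density parameter and the toy MLP are made up for the example.

```python
import torch
import torch.nn as nn


class UniformSparseLinear(nn.Module):
    """Linear layer whose weights are masked to a fixed, evenly spread
    sparse connectivity pattern (an illustrative sketch of the USN idea)."""

    def __init__(self, in_features, out_features, density=0.1, seed=0):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # Give every output unit the same number of incoming connections,
        # chosen uniformly at random over the inputs; the pattern is fixed
        # at construction time and never searched or re-learned.
        k = max(1, int(round(density * in_features)))
        gen = torch.Generator().manual_seed(seed)
        mask = torch.zeros(out_features, in_features)
        for row in range(out_features):
            idx = torch.randperm(in_features, generator=gen)[:k]
            mask[row, idx] = 1.0
        self.register_buffer("mask", mask)

    def forward(self, x):
        # Re-apply the mask at every forward pass so pruned weights stay zero.
        return nn.functional.linear(
            x, self.linear.weight * self.mask, self.linear.bias
        )


# Example: a small MLP for MNIST-sized inputs built from uniform sparse layers.
model = nn.Sequential(
    UniformSparseLinear(784, 256, density=0.1),
    nn.ReLU(),
    UniformSparseLinear(256, 10, density=0.1),
)
print(model(torch.randn(2, 784)).shape)  # torch.Size([2, 10])
```

Because the mask is fixed at construction, no architecture search is involved; in practice the density would be chosen to match the desired parameter budget (the abstract reports strong results with only about 0.55% of the fully connected parameter count overall).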

Updated: 2020-12-01