Deep neural networks with a set of node-wise varying activation functions.
Neural Networks (IF 7.8) Pub Date: 2020-03-09, DOI: 10.1016/j.neunet.2020.03.004
Jinhyeok Jang, Hyunjoong Cho, Jaehong Kim, Jaeyeon Lee, Seungjoon Yang

In this study, we present deep neural networks with a set of node-wise varying activation functions. The feature-learning ability of each node is affected by its assigned activation function, with nodes of smaller index becoming increasingly sensitive during training. As a result, the features learned by the nodes are sorted by node index in order of importance, so that the more sensitive nodes correspond to the more important features. The proposed networks thus learn not only the input features but also the importance of those features. Nodes of lower importance can be pruned to reduce network complexity, and the pruned networks can be retrained without incurring performance losses. We validated the feature-sorting property of the proposed method on both shallow and deep networks, as well as on deep networks transferred from existing networks.
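A minimal sketch of the idea, assuming a PyTorch implementation: the tanh-with-per-node-slope activation family, the linear slope schedule, and the prune_low_importance helper below are illustrative assumptions, not the paper's exact construction. The sketch assigns larger (more sensitive) slopes to smaller-index hidden nodes and prunes by keeping only the first few nodes, which carry the most important features under the feature-sorting property.

```python
import torch
import torch.nn as nn

class NodeWiseActivation(nn.Module):
    """tanh with a fixed per-node slope: node i computes tanh(a_i * x_i).

    Smaller-index nodes get larger slopes (an assumed linear schedule),
    so they respond more strongly to their inputs during training.
    """
    def __init__(self, num_nodes, max_slope=2.0, min_slope=0.5):
        super().__init__()
        self.register_buffer("slopes",
                             torch.linspace(max_slope, min_slope, num_nodes))

    def forward(self, x):
        return torch.tanh(self.slopes * x)

class SortedFeatureNet(nn.Module):
    """One hidden layer whose nodes use node-wise varying activations."""
    def __init__(self, in_dim=784, hidden=256, out_dim=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.act = NodeWiseActivation(hidden)
        self.fc2 = nn.Linear(hidden, out_dim)

    def forward(self, x):
        return self.fc2(self.act(self.fc1(x)))

def prune_low_importance(model, keep):
    """Keep only the first `keep` hidden nodes, relying on the
    feature-sorting property: low-index nodes hold the important features."""
    pruned = SortedFeatureNet(model.fc1.in_features, keep,
                              model.fc2.out_features)
    with torch.no_grad():
        pruned.fc1.weight.copy_(model.fc1.weight[:keep])
        pruned.fc1.bias.copy_(model.fc1.bias[:keep])
        pruned.act.slopes.copy_(model.act.slopes[:keep])
        pruned.fc2.weight.copy_(model.fc2.weight[:, :keep])
        pruned.fc2.bias.copy_(model.fc2.bias)
    return pruned

if __name__ == "__main__":
    model = SortedFeatureNet()
    x = torch.randn(32, 784)
    pruned = prune_low_importance(model, keep=64)
    print(model(x).shape, pruned(x).shape)  # both torch.Size([32, 10])
```

In this sketch the pruned network copies the surviving weights and slopes directly, so it can be retrained from that state, mirroring the retraining-without-performance-loss step described in the abstract.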
