Efficient architecture for deep neural networks with heterogeneous sensitivity
Neural Networks (IF 6.0) | Pub Date: 2020-11-10 | DOI: 10.1016/j.neunet.2020.10.017
Hyunjoong Cho, Jinhyeok Jang, Chanhyeok Lee, Seungjoon Yang

In this study, we present a neural network that consists of nodes with heterogeneous sensitivity. Each node in the network is assigned a variable that determines the sensitivity with which it learns to perform a given task. The network is trained via a constrained optimization that maximizes the sparsity of the sensitivity variables while ensuring optimal network performance. As a result, the network learns to perform the task using only a few sensitive nodes. Insensitive nodes, i.e., nodes with zero sensitivity, can be removed from a trained network to obtain a computationally efficient one. Removing zero-sensitivity nodes has no effect on performance because the network has already been trained to perform the task without them. The regularization parameter used to solve the optimization problem is found simultaneously during training. To validate our approach, we designed computationally efficient architectures for tasks such as autoregression, object recognition, facial expression recognition, and object detection on a variety of datasets. In our experiments, the networks designed by the proposed method achieved the same or better performance with far lower computational complexity.
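The abstract describes the mechanism only at a high level, so the following is a minimal sketch of the idea, not the authors' exact formulation: each node is gated by a trainable sensitivity variable, and an L1 penalty on those variables stands in for the sparsity-maximizing constrained optimization. The fixed penalty weight `lam`, the layer sizes, the toy data, and the pruning threshold are all illustrative assumptions (in the paper, the regularization parameter is determined during training).

```python
import torch
import torch.nn as nn

class SensitiveLayer(nn.Module):
    """Fully connected layer whose outputs are scaled by per-node sensitivity variables."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # One trainable sensitivity variable per output node, initialized to 1.
        self.sensitivity = nn.Parameter(torch.ones(out_features))

    def forward(self, x):
        return self.sensitivity * torch.relu(self.linear(x))

model = nn.Sequential(
    SensitiveLayer(16, 64),
    SensitiveLayer(64, 64),
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 1e-3  # sparsity weight; fixed here for simplicity

x = torch.randn(256, 16)        # toy regression data
y = x.sum(dim=1, keepdim=True)

for step in range(2000):
    optimizer.zero_grad()
    task_loss = nn.functional.mse_loss(model(x), y)
    # L1 penalty on the sensitivity variables drives insensitive nodes to zero.
    sparsity = sum(m.sensitivity.abs().sum()
                   for m in model if isinstance(m, SensitiveLayer))
    (task_loss + lam * sparsity).backward()
    optimizer.step()

# Nodes with (near-)zero sensitivity can be pruned without changing the output.
for m in model:
    if isinstance(m, SensitiveLayer):
        kept = int((m.sensitivity.abs() > 1e-3).sum())
        print(f"kept {kept} of {m.sensitivity.numel()} nodes")
```

Because a node whose sensitivity reaches zero contributes nothing to the forward pass, deleting it along with its weights leaves the trained function unchanged, which is the source of the efficiency claimed in the abstract.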



Last updated: 2020-12-07