Effective node selection technique towards sparse learning
Applied Intelligence (IF 3.4) Pub Date: 2020-05-15, DOI: 10.1007/s10489-020-01720-5
Bunyodbek Ibrokhimov , Cheonghwan Hur , Sanggil Kang

Neural networks are getting wider and deeper to achieve state-of-the-art results in various machine learning domains. Such networks have complex structures, large model sizes, and high computational costs. Moreover, they fail to adapt to new data because they are confined to a specific domain-target space. To tackle these issues, we propose a sparse learning method that trains an existing network on new classes by selecting non-crucial parameters from the network. Sparse learning also preserves the performance on existing classes, with no additional network structure or memory cost, by employing an effective node selection technique: it applies information theory to the neuron distribution of the fully connected layers to analyze and select unimportant parameters. Our method can learn up to 40% novel classes without notable loss in the accuracy of existing classes. Through experiments, we show that the sparse learning method matches state-of-the-art methods in accuracy and even surpasses related algorithms in memory efficiency, processing speed, and overall training time. Importantly, our method can be applied in both small and large applications, which we demonstrate on well-known networks such as LeNet, AlexNet, and VGG-16.
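The node selection step can be illustrated with a minimal sketch. The abstract only states that information theory is applied to the neuron distribution of the fully connected layers; the concrete choice below — scoring each node by the Shannon entropy of its activation histogram and treating low-entropy nodes as non-crucial — is an assumption for illustration, not the paper's exact criterion. The function name and the 40% ratio (matching the reported fraction of novel classes) are likewise illustrative.

```python
import numpy as np

def select_noncrucial_nodes(activations, ratio=0.4):
    """Rank fully connected layer nodes by the entropy of their activation
    distribution and return indices of the least informative ones.

    activations: (num_samples, num_nodes) array of node outputs recorded
                 on data from the existing classes
    ratio:       fraction of nodes to free up for learning new classes

    NOTE: entropy-based scoring is an assumed instantiation of the
    information-theoretic criterion described in the abstract.
    """
    num_nodes = activations.shape[1]
    scores = np.empty(num_nodes)
    for j in range(num_nodes):
        # Histogram each node's activations and compute Shannon entropy.
        hist, _ = np.histogram(activations[:, j], bins=10)
        p = hist / hist.sum()
        p = p[p > 0]  # drop empty bins to keep log well-defined
        scores[j] = -np.sum(p * np.log2(p))
    # Low-entropy nodes carry the least information about the existing
    # classes, so they are the candidates to retrain on new classes.
    k = int(ratio * num_nodes)
    return np.argsort(scores)[:k]

# Example: 1000 samples recorded at a 128-node fully connected layer.
acts = np.random.randn(1000, 128)
free_nodes = select_noncrucial_nodes(acts, ratio=0.4)
```

In an actual sparse-learning loop, only the parameters feeding the returned nodes would be updated on the new classes, leaving the remaining parameters frozen so that accuracy on existing classes is retained.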




Updated: 2020-05-15