Class-Imbalanced Deep Learning via a Class-Balanced Ensemble.
IEEE Transactions on Neural Networks and Learning Systems ( IF 10.2 ) Pub Date : 2021-04-26 , DOI: 10.1109/tnnls.2021.3071122
Zhi Chen, Jiang Duan, Li Kang, Guoping Qiu

Class imbalance is a prevalent phenomenon in many real-world applications, and it presents significant challenges to model learning, including deep learning. In this work, we embed ensemble learning into deep convolutional neural networks (CNNs) to tackle the class-imbalanced learning problem. An ensemble of auxiliary classifiers branching out from various hidden layers of a CNN is trained together with the CNN in an end-to-end manner. To that end, we design a new loss function that can rectify the bias toward the majority classes by forcing the CNN's hidden layers and their associated auxiliary classifiers to focus on the samples misclassified by previous layers, thus enabling subsequent layers to develop diverse behavior and to correct the errors of previous layers in a batch-wise manner. A unique feature of the new method is that the ensemble of auxiliary classifiers can either work together with the main CNN to form a more powerful combined classifier, or be removed after the CNN has finished training, in which case it serves only to assist the CNN's class-imbalanced learning and to enhance the network's ability to handle class-imbalanced data. Comprehensive experiments are conducted on four benchmark data sets of increasing complexity (CIFAR-10, CIFAR-100, iNaturalist, and CelebA), and the results demonstrate significant performance improvements over state-of-the-art deep imbalance learning methods.
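The batch-wise error-correction idea described above can be sketched with a minimal NumPy example: samples misclassified by a previous auxiliary classifier receive larger weights in the next branch's loss. Note that the `boost` factor, the normalization to unit mean, and the weighted cross-entropy form below are illustrative assumptions, not the paper's exact loss function.

```python
import numpy as np

def misclassification_weights(prev_logits, labels, boost=2.0):
    """Per-sample weights for the next auxiliary classifier in the batch.

    Samples misclassified by the previous branch get `boost` times the
    base weight; weights are normalized to have mean 1 over the batch.
    (Illustrative sketch; `boost` and the rule are assumptions.)
    """
    preds = prev_logits.argmax(axis=1)
    w = np.where(preds != labels, boost, 1.0)
    return w / w.sum() * len(labels)

def weighted_cross_entropy(logits, labels, weights):
    """Numerically stable softmax cross-entropy, weighted per sample."""
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    nll = -log_probs[np.arange(len(labels)), labels]
    return float((weights * nll).mean())

# Toy batch: 4 samples, 3 classes; the previous branch misclassifies
# sample 2 (predicts class 0, true class is 2), so it is up-weighted.
prev_logits = np.array([[2., 0., 0.],
                        [0., 2., 0.],
                        [2., 0., 0.],
                        [0., 0., 2.]])
labels = np.array([0, 1, 2, 2])
w = misclassification_weights(prev_logits, labels)
loss = weighted_cross_entropy(prev_logits, labels, w)
```

In a full implementation, each hidden-layer branch would compute such a weighted loss using the error pattern of the preceding branch, and all branch losses would be summed with the main CNN's loss for end-to-end training.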
