Deep N-ary Error Correcting Output Codes
arXiv - CS - Distributed, Parallel, and Cluster Computing. Pub Date: 2020-09-22, DOI: arxiv-2009.10465
Hao Zhang, Joey Tianyi Zhou, Tianying Wang, Ivor W. Tsang, Rick Siow Mong Goh

Ensemble learning consistently improves the performance of multi-class classification by aggregating a series of base classifiers. To this end, data-independent ensemble methods such as Error Correcting Output Codes (ECOC) have attracted increasing attention due to their ease of implementation and parallelization. Specifically, traditional ECOC and its general extension, N-ary ECOC, decompose the original multi-class classification problem into a series of independent, simpler classification subproblems. Unfortunately, integrating ECOC, and especially N-ary ECOC, with deep neural networks, termed deep N-ary ECOC, is not straightforward and has not yet been fully explored in the literature, owing to the high cost of training the base learners. To facilitate training N-ary ECOC with deep base learners, we further propose three variants of parameter-sharing architectures for deep N-ary ECOC. To verify the generalization ability of deep N-ary ECOC, we conduct experiments with different deep neural network backbones on both image and text classification tasks. Furthermore, extensive ablation studies show that deep N-ary ECOC outperforms other deep data-independent ensemble methods.
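To make the decomposition concrete, the following is a minimal sketch (not the paper's implementation) of how an N-ary ECOC scheme can be set up: each class is assigned a codeword of length equal to the number of base learners, with symbols drawn from {0, ..., N-1}; at test time, the joint prediction of the base learners is decoded to the class with the nearest codeword under Hamming distance. The function names and the random matrix construction are illustrative assumptions.

```python
import random

def nary_code_matrix(num_classes, num_learners, n, seed=0):
    """Random N-ary coding matrix: each class gets a codeword of length
    num_learners with symbols in {0, ..., n-1}. Every column (subproblem)
    must use at least two distinct symbols, or its base learner is
    degenerate; rows must be distinct so classes are separable."""
    rng = random.Random(seed)
    while True:
        m = [[rng.randrange(n) for _ in range(num_learners)]
             for _ in range(num_classes)]
        cols_ok = all(len({m[c][j] for c in range(num_classes)}) > 1
                      for j in range(num_learners))
        rows_ok = len({tuple(row) for row in m}) == num_classes
        if cols_ok and rows_ok:
            return m

def decode(matrix, predictions):
    """Assign the class whose codeword has the smallest Hamming
    distance to the base learners' joint prediction."""
    def hamming(row):
        return sum(a != b for a, b in zip(row, predictions))
    return min(range(len(matrix)), key=lambda c: hamming(matrix[c]))

# 4 classes, 6 base learners, ternary (N=3) subproblems.
M = nary_code_matrix(4, 6, 3, seed=42)
# If the base learners reproduce class 2's codeword exactly,
# decoding recovers class 2.
print(decode(M, M[2]))  # prints 2
```

The error-correcting property comes from redundancy: with enough base learners, the decoder can recover the true class even when a few subproblem predictions are wrong, provided the codewords are sufficiently far apart in Hamming distance.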

Updated: 2020-10-21