Ensemble learning consistently improves the performance of multi-class classification by aggregating a series of base classifiers. To this end, data-independent ensemble methods such as Error-Correcting Output Codes (ECOC) have attracted increasing attention due to their ease of implementation and parallelization. Specifically, the traditional ECOC and its general extension, N-ary ECOC, decompose the original multi-class classification problem into a series of independent, simpler classification subproblems. Unfortunately, integrating ECOC, and especially N-ary ECOC, with deep neural networks, termed deep N-ary ECOC, is not straightforward and has not yet been fully explored in the literature, owing to the high cost of training the base learners. To facilitate the training of N-ary ECOC with deep base learners, we propose three variants of parameter-sharing architectures for deep N-ary ECOC. To verify the generalization ability of deep N-ary ECOC, we conduct experiments with different deep neural network backbones on both image and text classification tasks. Furthermore, extensive ablation studies show that deep N-ary ECOC outperforms other deep data-independent ensemble methods.
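The encode/decode scheme underlying N-ary ECOC can be illustrated with a minimal sketch. This is not the paper's implementation: the matrix dimensions (K classes, N-ary alphabet, L base learners) and random code construction are illustrative assumptions; each column of the code matrix defines one N-class subproblem, and a test point is assigned to the class whose codeword is closest in Hamming distance to the base learners' joint prediction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: K classes, an N-ary code alphabet, L base learners.
K, N, L = 10, 4, 20

# Each row is a class codeword; each column defines one independent
# N-class classification subproblem for a base learner.
code_matrix = rng.integers(0, N, size=(K, L))

def decode(predictions, code_matrix):
    """Return the class whose codeword has minimum Hamming distance
    to the vector of base-learner predictions."""
    distances = (code_matrix != predictions).sum(axis=1)
    return int(np.argmin(distances))

# Sanity check: if the base learners jointly output class 3's exact
# codeword, decoding recovers class 3.
print(decode(code_matrix[3], code_matrix))
```

Because decoding only compares symbol vectors, the L subproblems can be trained fully in parallel, which is the data-independent property the abstract refers to; the parameter-sharing variants proposed in the paper address the cost of training L deep base learners.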