This article introduces a multiple-classifier method that improves the performance of concatenate-designed neural networks, such as ResNet and DenseNet, by alleviating the pressure on the final classifier. We present the design of the classifiers, which collect the features produced between the network sets, and describe the constituent layers and the activation function used to compute each classifier's classification score. We use L2 normalization instead of Softmax normalization to obtain the classifier scores, and we determine the training conditions that promote convergence. As a result, the proposed classifiers significantly improve accuracy in the experimental cases, showing that the method not only outperforms the original models but also converges faster. Moreover, our classifiers are general and can be applied to any classification-oriented concatenate-designed network model.
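To make the idea concrete, the following is a minimal PyTorch-style sketch of attaching one classifier after each network stage and replacing Softmax normalization of the scores with L2 normalization. The stage split, the pooling layer, and the averaging of per-stage scores are illustrative assumptions for this sketch, not the paper's exact design; `IntermediateClassifier` and `MultiClassifierNet` are hypothetical names.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class IntermediateClassifier(nn.Module):
    """Auxiliary classifier attached to one stage's output.

    The pooling/linear layers here are assumptions; the paper specifies
    its own constituent layers and activation function.
    """
    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # collapse spatial dims
        self.fc = nn.Linear(in_channels, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.pool(x).flatten(1)
        logits = self.fc(z)
        # L2-normalize the score vector instead of applying Softmax
        return F.normalize(logits, p=2, dim=1)

class MultiClassifierNet(nn.Module):
    """Backbone split into sequential stages, one classifier per stage."""
    def __init__(self, stages, stage_channels, num_classes: int):
        super().__init__()
        self.stages = nn.ModuleList(stages)
        self.classifiers = nn.ModuleList(
            IntermediateClassifier(c, num_classes) for c in stage_channels
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        scores = []
        for stage, clf in zip(self.stages, self.classifiers):
            x = stage(x)                  # features produced between the sets
            scores.append(clf(x))         # each set gets its own score
        # Averaging the per-stage scores is one plausible way to combine
        # them into a final prediction (an assumption of this sketch).
        return torch.stack(scores).mean(dim=0)
```

Because every stage receives its own supervision signal, the final classifier no longer carries the full classification burden alone, which is the mechanism the abstract credits for the accuracy gains and faster convergence.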