In this paper, we propose a mean squared error (MSE) loss with outlying labels for class-imbalanced classification. Cross-entropy (CE) loss, which is widely used for image recognition, trains a network via back-propagation so that the predicted probability of the true class approaches one. On imbalanced datasets, however, classes with few samples are learned insufficiently. We therefore propose a novel classification method using MSE loss, which can learn the relationships among all classes regardless of which image is input. Unlike CE loss, MSE loss makes it possible to equalize the number of back-propagation updates across all classes and to learn a feature space that accounts for inter-class relationships, as in metric learning. Furthermore, instead of the usual one-hot teacher label, we use a novel teacher label that takes the number of samples in each class into account. This induces an outlying label that depends on the per-class sample count, so that classes with few samples gain an outlying margin in the feature space. As a result, a feature space can be created that separates high-difficulty classes from low-difficulty classes. Experiments on imbalanced classification and semantic segmentation confirm that the proposed method substantially outperforms standard CE loss and conventional methods, even though only the loss and teacher labels are changed.
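The abstract does not give the exact formula for the outlying label, so the following is only a minimal illustrative sketch of the general idea: instead of one-hot targets, each class's target value is scaled by a hypothetical factor based on its sample count, so that minority classes are pushed further out in the output space, and the MSE loss penalizes every class output for every input. The scaling rule (`max_count / count`) is an assumption for illustration, not the paper's method.

```python
import numpy as np

def outlying_labels(class_counts):
    """Build a hypothetical outlying-label table.

    Rarer classes receive a larger target value for their own entry,
    giving them an 'outlying' margin in the output space. The actual
    scaling used in the paper is not specified in the abstract; the
    max_count / count rule here is an illustrative assumption.
    """
    counts = np.asarray(class_counts, dtype=float)
    scale = counts.max() / counts          # >= 1, larger for minority classes
    return np.eye(len(counts)) * scale[:, None]

def mse_loss_with_outlying_label(outputs, targets, label_table):
    """MSE against the outlying teacher labels.

    Unlike cross entropy, MSE penalizes every class output, so all
    classes receive a gradient signal for every input image.
    """
    teacher = label_table[targets]         # (batch, num_classes)
    return np.mean((outputs - teacher) ** 2)
```

For example, with class counts `[100, 10]`, the majority class keeps a target of 1.0 for its own entry while the minority class's target is scaled to 10.0, placing it further from the origin.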