Deep classifiers have achieved great success in visual recognition. However, real-world data is long-tailed by nature, leading to a mismatch between the training and testing distributions. In this report, we introduce Balanced Activation (Balanced Softmax and Balanced Sigmoid), an elegant, unbiased, and simple extension of the Sigmoid and Softmax activation functions that accommodates the label distribution shift between training and testing in object detection. We derive the generalization bound for multiclass Softmax regression and show that our loss minimizes this bound. In our experiments, we demonstrate that Balanced Activation generally provides a gain of ~3% mAP on LVIS-1.0 and outperforms the current state-of-the-art methods without introducing any extra parameters.
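The abstract does not spell out the loss itself, but a commonly used formulation of Balanced Softmax compensates for the training label prior by shifting each logit by the log of its class's training-sample count before applying the standard cross-entropy. The sketch below illustrates that idea; it is a minimal PyTorch example, and the names `balanced_softmax_loss` and `class_counts` are illustrative, not from this report.

```python
import torch
import torch.nn.functional as F

def balanced_softmax_loss(logits, labels, class_counts):
    """Cross-entropy on logits shifted by the log class frequency.

    Adding log(n_j) to logit j lets a Softmax trained on the
    long-tailed training distribution approximate the balanced
    (uniform-prior) conditional probability used at test time.
    """
    log_prior = torch.log(class_counts.float().clamp(min=1))
    return F.cross_entropy(logits + log_prior, labels)

# Illustrative usage: 3 classes with long-tailed training counts.
logits = torch.randn(8, 3)
labels = torch.randint(0, 3, (8,))
counts = torch.tensor([1000, 100, 10])
loss = balanced_softmax_loss(logits, labels, counts)
```

Because the correction is a fixed additive shift computed from the training label distribution, it introduces no extra learnable parameters, consistent with the claim above.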