Object detection aims to localize and classify objects in a given image, and these two tasks are sensitive to different object regions. Consequently, some locations predict high-quality bounding boxes but low classification scores, while others behave the opposite way: a misalignment exists between the two tasks, and their features are spatially entangled. To solve this misalignment, we propose a plug-in Spatial-disentangled and Task-aligned operator (SALT). By predicting two task-aware point sets located in each task's sensitive regions, SALT reassigns features from those regions and aligns them to the corresponding anchor point, so that the features of the two tasks become spatially aligned and disentangled. To minimize the difference between the two regression stages, we propose a Self-distillation regression (SDR) loss that transfers knowledge from the refined regression results to the coarse regression results. Building on SALT and the SDR loss, we propose SALT-Net, which explicitly exploits task-aligned point-set features for accurate detection. Extensive experiments on the MS-COCO dataset show that the proposed methods consistently boost different state-of-the-art dense detectors by $\sim$2 AP. Notably, SALT-Net with a Res2Net-101-DCN backbone achieves 53.8 AP on the MS-COCO test-dev set.
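To make the SALT idea concrete, below is a minimal PyTorch sketch of how task-aware point sets could be predicted and used to reassign features onto each anchor point. The module name, the number of points, and the bilinear-sampling/1x1-projection aggregation are assumptions for illustration only, not the paper's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SALTSketch(nn.Module):
    """Sketch of a Spatial-disentangled and Task-aligned operator.

    For each anchor point, two task-aware point sets (one for classification,
    one for regression) are predicted as per-location offsets. Features are
    sampled at those points and aggregated back onto the anchor point, so the
    two task branches see spatially aligned but disentangled features.
    """

    def __init__(self, channels, num_points=9):
        super().__init__()
        self.num_points = num_points
        # Per-location (x, y) offsets for the two task-aware point sets.
        self.cls_offsets = nn.Conv2d(channels, 2 * num_points, 3, padding=1)
        self.reg_offsets = nn.Conv2d(channels, 2 * num_points, 3, padding=1)
        # Project the concatenated point-set features back to `channels`.
        self.cls_proj = nn.Conv2d(channels * num_points, channels, 1)
        self.reg_proj = nn.Conv2d(channels * num_points, channels, 1)

    def _sample(self, feat, offsets):
        b, c, h, w = feat.shape
        # Base grid of anchor-point coordinates, normalized to [-1, 1].
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h, device=feat.device),
            torch.linspace(-1, 1, w, device=feat.device),
            indexing="ij",
        )
        base = torch.stack((xs, ys), dim=-1)                     # (h, w, 2)
        off = offsets.view(b, self.num_points, 2, h, w).permute(0, 1, 3, 4, 2)
        # Convert pixel offsets to grid_sample's normalized coordinates.
        off = off / off.new_tensor([max(w - 1, 1), max(h - 1, 1)]) * 2
        grid = (base.unsqueeze(0).unsqueeze(0) + off).reshape(b, self.num_points * h, w, 2)
        sampled = F.grid_sample(feat, grid, align_corners=True)  # (b, c, P*h, w)
        return sampled.view(b, c, self.num_points, h, w).reshape(b, c * self.num_points, h, w)

    def forward(self, feat):
        # Each task gets features gathered from its own sensitive regions,
        # aligned to the same anchor locations.
        cls_feat = self.cls_proj(self._sample(feat, self.cls_offsets(feat)))
        reg_feat = self.reg_proj(self._sample(feat, self.reg_offsets(feat)))
        return cls_feat, reg_feat
```

In this sketch the classification and regression branches share the input feature map but sample it at different, independently learned locations, which is the spatial disentanglement and task alignment the abstract describes.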
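The SDR loss can likewise be sketched as a self-distillation term between the two regression stages. The formulation below assumes an IoU-style distance between coarse boxes and detached refined boxes (in xyxy format); it illustrates the knowledge-transfer direction described in the abstract rather than the paper's exact loss.

```python
import torch


def sdr_loss_sketch(coarse_boxes, refined_boxes):
    """Sketch of a self-distillation regression (SDR) loss.

    Penalizes disagreement between coarse and refined boxes; the refined
    boxes are detached so gradients only flow into the coarse regression
    branch, transferring knowledge from refined to coarse results.
    """
    refined_boxes = refined_boxes.detach()
    # Intersection of each coarse box with its refined counterpart.
    lt = torch.max(coarse_boxes[:, :2], refined_boxes[:, :2])
    rb = torch.min(coarse_boxes[:, 2:], refined_boxes[:, 2:])
    wh = (rb - lt).clamp(min=0)
    inter = wh[:, 0] * wh[:, 1]
    area_c = (coarse_boxes[:, 2] - coarse_boxes[:, 0]) * (coarse_boxes[:, 3] - coarse_boxes[:, 1])
    area_r = (refined_boxes[:, 2] - refined_boxes[:, 0]) * (refined_boxes[:, 3] - refined_boxes[:, 1])
    union = (area_c + area_r - inter).clamp(min=1e-6)
    iou = inter / union
    return (1.0 - iou).mean()
```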