We address weakly-supervised low-shot instance segmentation, an annotation-efficient training setting for handling novel classes effectively. Since this is an under-explored problem, we first investigate its difficulty and identify the performance bottleneck by systematically analyzing model components and individual sub-tasks with a simple baseline model. Based on these analyses, we propose ENInst with two sub-task enhancement methods: instance-wise mask refinement to improve pixel localization quality, and novel classifier composition to improve classification accuracy. Our proposed method lifts overall performance by enhancing each sub-task. We demonstrate that ENInst is 7.5 times more annotation-efficient than existing fully-supervised few-shot models while achieving comparable performance, and even outperforms them at times.