The generalization power of the pre-trained model is key to few-shot deep learning. Dropout is a regularization technique widely used in conventional deep learning. In this paper, we explore the power of dropout for few-shot learning and provide insights into how to use it. Extensive experiments on few-shot object detection and few-shot image classification datasets, i.e., Pascal VOC, MS COCO, CUB, and mini-ImageNet, validate the effectiveness of our method.
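As background for readers, the dropout technique the abstract refers to can be sketched as standard inverted dropout: during training, each activation is zeroed with probability p and the survivors are rescaled by 1/(1-p) so the expected activation is unchanged; at inference time the layer is an identity. This is a minimal NumPy illustration of the general technique, not the paper's specific method.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout.

    During training, zero each unit independently with probability p and
    scale the surviving units by 1/(1-p) so E[output] == E[input].
    At inference (training=False), return the input unchanged.
    """
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng(0) if rng is None else rng
    mask = rng.random(x.shape) >= p  # keep a unit with probability 1-p
    return x * mask / (1.0 - p)

# The rescaling keeps the mean activation roughly constant:
x = np.ones(10_000)
y = dropout(x, p=0.5)
print(y.mean())          # close to 1.0 despite half the units being zeroed
print(dropout(x, training=False).mean())  # exactly 1.0 at inference
```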