While recent studies on semi-supervised learning have shown remarkable progress in leveraging both labeled and unlabeled data, most of them presume a basic setting in which the model is randomly initialized. In this work, we consider semi-supervised learning and transfer learning jointly, leading to a more practical and competitive paradigm that can utilize both powerful pre-trained models from the source domain and labeled/unlabeled data in the target domain. To better exploit the value of both pre-trained weights and unlabeled target examples, we introduce adaptive consistency regularization, which consists of two complementary components: Adaptive Knowledge Consistency (AKC) between the source and target models on selected examples, and Adaptive Representation Consistency (ARC) of the target model between labeled and unlabeled examples. Examples involved in the consistency regularization are adaptively selected according to their potential contributions to the target task. We conduct extensive experiments on popular benchmarks including CIFAR-10, CUB-200, and MURA, by fine-tuning the ImageNet pre-trained ResNet-50 model. Results show that our proposed adaptive consistency regularization outperforms state-of-the-art semi-supervised learning techniques such as Pseudo Label, Mean Teacher, and FixMatch. Moreover, our algorithm is orthogonal to existing methods and thus able to gain additional improvements on top of MixMatch and FixMatch. Our code is available at https://github.com/SHI-Labs/Semi-Supervised-Transfer-Learning.
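To make the two regularizers concrete, here is a minimal NumPy sketch of losses in the spirit of AKC and ARC. This is not the authors' implementation: the confidence threshold `tau` for adaptive example selection and the linear-kernel mean-matching used for representation consistency are illustrative assumptions, standing in for whatever selection criterion and distribution distance the paper actually uses.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def akc_loss(source_logits, target_logits, tau=0.9):
    """Adaptive Knowledge Consistency (sketch): KL divergence between the
    frozen source model's predictions and the target model's predictions,
    computed only on examples where the source model is confident.
    `tau` is an assumed confidence threshold, not a value from the paper."""
    p = softmax(source_logits)   # source (pre-trained) model distribution
    q = softmax(target_logits)   # target (fine-tuned) model distribution
    keep = p.max(axis=1) >= tau  # adaptive example selection
    if not keep.any():
        return 0.0
    kl = np.sum(p[keep] * (np.log(p[keep] + 1e-12) - np.log(q[keep] + 1e-12)),
                axis=1)
    return float(kl.mean())

def arc_loss(feat_labeled, feat_unlabeled):
    """Adaptive Representation Consistency (sketch): align the mean target-model
    feature embeddings of labeled and unlabeled examples (a simple
    linear-kernel MMD surrogate for matching the two distributions)."""
    gap = feat_labeled.mean(axis=0) - feat_unlabeled.mean(axis=0)
    return float(np.dot(gap, gap))
```

In training, both terms would be added to the supervised cross-entropy on labeled target data, with the source model kept fixed as the teacher for the AKC term.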