One of the successful approaches in semi-supervised learning is based on consistency regularization. Typically, a student model is trained to be consistent with the teacher's predictions for inputs under different perturbations. To be successful, the prediction targets given by the teacher should have good quality; otherwise, the student can be misled by the teacher. Unfortunately, existing methods do not assess the quality of the teacher targets. In this paper, we propose a novel Certainty-driven Consistency Loss (CCL) that exploits predictive uncertainty in the consistency loss to let the student dynamically learn from reliable targets. Specifically, we propose two approaches, Filtering CCL and Temperature CCL, which either filter out uncertain predictions or pay less attention to them in the consistency regularization. We further introduce a novel decoupled framework to encourage model difference. Experimental results on SVHN, CIFAR-10, and CIFAR-100 demonstrate the advantages of our method over several existing methods.
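The two variants can be illustrated with a minimal sketch. The function names, the MSE consistency form, and the soft weighting `exp(-uncertainty)` are assumptions for illustration only; the paper's exact formulation and its uncertainty estimator (e.g. how it is derived from the teacher) are not specified in this abstract.

```python
import numpy as np

def filtering_ccl(student_probs, teacher_probs, uncertainty, threshold):
    """Hard-filtering variant (illustrative): teacher targets whose
    uncertainty exceeds `threshold` are masked out of the consistency
    (here, MSE) loss, so the student learns only from reliable targets."""
    mask = (uncertainty < threshold).astype(float)            # 1 = reliable target
    per_sample = ((student_probs - teacher_probs) ** 2).mean(axis=1)
    # Average only over the retained (certain) samples.
    return (mask * per_sample).sum() / max(mask.sum(), 1.0)

def temperature_ccl(student_probs, teacher_probs, uncertainty):
    """Soft-weighting variant (illustrative): instead of a hard filter,
    down-weight uncertain targets with a weight exp(-uncertainty)."""
    weights = np.exp(-uncertainty)
    per_sample = ((student_probs - teacher_probs) ** 2).mean(axis=1)
    return (weights * per_sample).sum() / weights.sum()
```

In both cases the loss reduces to a plain consistency loss when all targets are certain; the difference is only in how unreliable teacher predictions are discounted.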