While large self-supervised models have rivalled the performance of their supervised counterparts, small models still struggle. In this report, we explore simple distillation baselines, called SimDis, for improving small self-supervised models. Specifically, we present an offline-distillation baseline, which establishes a new state of the art, and an online-distillation baseline, which achieves similar performance with minimal computational overhead. We hope these baselines will serve as useful references for relevant future research. Code is available at: https://github.com/JindongGu/SimDis/