Self-supervised pre-training appears as an advantageous alternative to supervised pre-training for transfer learning. By synthesizing annotations on pretext tasks, self-supervision enables pre-training models on large amounts of pseudo-labeled data before fine-tuning them on the target task. In this work, we assess self-supervision for the diagnosis of skin lesions, comparing three self-supervised pipelines to a challenging supervised baseline, on five test datasets comprising in- and out-of-distribution samples. Our results show that self-supervision is competitive both in improving accuracy and in reducing the variability of outcomes. Self-supervision proves particularly useful in low training data scenarios ($<1\,500$ and $<150$ samples), where its ability to stabilize outcomes is essential for providing sound results.