Learning effective visual representations that generalize well without human supervision is a fundamental problem for applying machine learning to a wide variety of tasks. Recently, two families of self-supervised methods, contrastive learning and latent bootstrapping, exemplified by SimCLR and BYOL respectively, have made significant progress. In this work, we hypothesize that adding explicit information compression to these algorithms yields better and more robust representations. We verify this by developing SimCLR and BYOL formulations that are compatible with the Conditional Entropy Bottleneck (CEB) objective, allowing us to both measure and control the amount of compression in the learned representation, and to observe its impact on downstream tasks. Furthermore, we explore the relationship between Lipschitz continuity and compression, showing a tractable lower bound on the Lipschitz constant of the encoders we learn. Since Lipschitz continuity is closely related to robustness, this provides a new explanation for why compressed models are more robust. Our experiments confirm that adding compression to SimCLR and BYOL significantly improves linear evaluation accuracy and model robustness across a wide range of domain shifts. In particular, the compressed version of BYOL achieves 76.0% Top-1 linear evaluation accuracy on ImageNet with ResNet-50, and 78.8% with ResNet-50 2x.
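To make the role of the compression term concrete, the following is a minimal sketch of the CEB objective in one common variational form; the symbols here (a forward encoder $e(z\mid x)$, a backward encoder $b(z\mid y)$, a decoder $c(y\mid z)$, and a compression weight $\beta$) are illustrative notation chosen for this sketch, not notation taken from the abstract above:
\begin{align}
\mathrm{CEB} \;\equiv\; \min_{Z}\; \beta\, I(X;Z\mid Y) \;-\; I(Y;Z)
\;\le\; \min\; \mathbb{E}_{(x,y)\sim p(x,y),\, z\sim e(z\mid x)}
\Big[\, \beta \big( \log e(z\mid x) - \log b(z\mid y) \big) \;-\; \log c(y\mid z) \,\Big] \;-\; H(Y),
\end{align}
where the inequality is the standard variational bound (an upper bound on $I(X;Z\mid Y)$ via $b$, a lower bound on $I(Y;Z)$ via $c$), and $H(Y)$ is a constant of the data that can be dropped during optimization. The weight $\beta \ge 0$ controls the amount of compression: $\beta = 0$ imposes no explicit compression, while larger $\beta$ discards more of the information in $X$ that is not shared with $Y$. In the self-supervised setting summarized above, the two augmented views of an image would plausibly play the roles of $X$ and $Y$, so the learned representation is pushed toward view-invariant information; the exact instantiation for SimCLR and BYOL is given in the body of the paper, not in this sketch.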