The downstream accuracy of self-supervised methods is tightly linked to the proxy task solved during training and the quality of the gradients extracted from it. Richer and more meaningful gradient updates are key to allowing self-supervised methods to learn better and more efficiently. In a typical self-distillation framework, the representations of two augmented views of an image are enforced to be coherent at the global level. Nonetheless, incorporating local cues in the proxy task can be beneficial and improve the model's accuracy on downstream tasks. This leads to a dual objective in which, on the one hand, coherence between global representations is enforced and, on the other, coherence between local representations is enforced. Unfortunately, an exact correspondence mapping between two sets of local representations does not exist, making the task of matching local representations from one augmentation to another non-trivial. We propose to leverage the spatial information in the input images to obtain geometric matchings and compare this geometric approach against previous methods based on similarity matchings. Our study shows not only that 1) geometric matchings perform better than similarity-based matchings in low-data regimes, but also 2) that similarity-based matchings are highly detrimental in low-data regimes compared to the vanilla baseline without local self-distillation. The code will be released upon acceptance.
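To make the two matching strategies concrete, below is a minimal sketch of how local representations of two augmented views could be paired. It assumes each patch center can be mapped back to the original image frame; the function names, tensor shapes, and the cosine-distance local loss are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def geometric_matching(coords_a, coords_b):
    """Match local tokens of two views by spatial proximity.

    coords_a: (Na, 2) patch centers of view A, in original-image pixels.
    coords_b: (Nb, 2) patch centers of view B, in the same frame.
    Returns, for each view-A patch, the index of the nearest view-B patch.
    """
    dists = torch.cdist(coords_a, coords_b)  # (Na, Nb) pairwise distances
    return dists.argmin(dim=1)

def similarity_matching(feats_a, feats_b):
    """Baseline: match local tokens by feature similarity instead of geometry.

    feats_a: (Na, D) and feats_b: (Nb, D) local representations.
    """
    sims = F.normalize(feats_a, dim=-1) @ F.normalize(feats_b, dim=-1).T
    return sims.argmax(dim=1)

# Toy usage (random tensors): a local self-distillation term that pulls each
# view-A token toward its geometrically matched view-B token.
Na, Nb, D = 16, 16, 64
coords_a, coords_b = torch.rand(Na, 2) * 224, torch.rand(Nb, 2) * 224
feats_a, feats_b = torch.randn(Na, D), torch.randn(Nb, D)

idx = geometric_matching(coords_a, coords_b)
local_loss = (1 - F.cosine_similarity(feats_a, feats_b[idx], dim=-1)).mean()
```

The key design difference: geometric matching depends only on where each patch came from in the input image, so it stays reliable even when the features themselves are poor (e.g., early in training or in low-data regimes), whereas similarity matching depends on the quality of the representations it is supposed to improve.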