Deep regression models typically learn in an end-to-end fashion and do not explicitly try to learn a regression-aware representation. Their representations tend to be fragmented and fail to capture the continuous nature of regression tasks. In this paper, we propose Supervised Contrastive Regression (SupCR), a framework that learns a regression-aware representation by contrasting samples against each other based on their target distance. SupCR is orthogonal to existing regression models and can be combined with such models to improve performance. Extensive experiments on five real-world regression datasets spanning computer vision, human-computer interaction, and healthcare show that SupCR achieves state-of-the-art performance and consistently improves over prior regression baselines across all datasets, tasks, and input modalities. SupCR also improves robustness to data corruptions, resilience to reduced training data, performance on transfer learning, and generalization to unseen targets.
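To make the idea of "contrasting samples based on their target distance" concrete, below is a minimal sketch of a supervised contrastive regression loss in PyTorch. It follows the general SupCR recipe (for each anchor, samples whose labels lie at least as far away as the current positive act as negatives), but the function name, tensor shapes, and the temperature value are illustrative assumptions rather than the paper's exact implementation.

```python
import torch
import torch.nn.functional as F


def supcr_style_loss(features, targets, temperature=2.0):
    """Sketch of a supervised contrastive regression loss.

    features: (N, D) embeddings from any encoder.
    targets:  (N,) continuous regression labels.
    For anchor i and positive j, every sample k whose label is at least
    as far from y_i as y_j is treated as a negative, so samples that are
    close in label space are pulled closer in feature space.
    """
    # Cosine similarities between L2-normalized embeddings, scaled by temperature.
    features = F.normalize(features, dim=1)
    sim = features @ features.t() / temperature                      # (N, N)

    # Pairwise label distances |y_i - y_k|.
    labels = targets.view(-1, 1).float()
    label_dist = torch.cdist(labels, labels, p=1)                    # (N, N)

    n = features.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=features.device)

    loss = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # Negatives for anchor i w.r.t. positive j: samples k != i whose
            # label is at least as far from y_i as y_j is (j itself included).
            neg_mask = (label_dist[i] >= label_dist[i, j]) & ~eye[i]
            log_denom = torch.logsumexp(sim[i][neg_mask], dim=0)
            loss = loss - (sim[i, j] - log_denom)
    return loss / (n * (n - 1))
```

In use, this loss would be applied to the encoder's embeddings during representation learning, with a regressor then trained (or fine-tuned) on top of the frozen or jointly trained features; the double loop is written for clarity and would normally be vectorized.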