In real-world scenarios, out-of-distribution (OOD) datasets may exhibit a large distributional shift from the training data. This phenomenon typically occurs when a trained classifier is deployed in varying, dynamic environments, causing a significant drop in performance. To tackle this issue, we propose an end-to-end deep multi-task network. Observing a strong correlation between rotation prediction (self-supervised) accuracy and semantic classification accuracy on OOD tasks, we introduce an additional auxiliary classification head in our multi-task network alongside the semantic classification and rotation prediction heads. To study the influence of this auxiliary classifier on improving the rotation prediction head, we frame the proposed learning method as a bi-level optimisation problem: the upper level updates the parameters of the semantic classification and rotation prediction heads, while the lower level updates only the auxiliary classification head through the semantic classification head, keeping the semantic classification head's parameters fixed. The proposed method is validated on three unseen OOD datasets, where it exhibits a clear improvement in semantic classification accuracy over two baseline methods. Our code is available on GitHub: \url{https://github.com/harshita-555/OSSL}
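The alternating bi-level update described above can be sketched as follows. This is a minimal, illustrative NumPy mock-up, not the paper's implementation: the linear heads over frozen backbone features, the data shapes, the learning rate, and the use of the frozen semantic head's predictions as the lower-level training signal are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: shared backbone features and labels (shapes are assumptions).
X = rng.normal(size=(32, 8))           # shared backbone features
y_sem = rng.integers(0, 4, size=32)    # semantic class labels
y_rot = rng.integers(0, 4, size=32)    # rotation labels (0/90/180/270 degrees)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def ce_grad(W, X, y):
    """Gradient of softmax cross-entropy w.r.t. a linear head W."""
    p = softmax(X @ W)
    p[np.arange(len(y)), y] -= 1.0
    return X.T @ p / len(y)

W_sem = np.zeros((8, 4))  # semantic classification head
W_rot = np.zeros((8, 4))  # rotation prediction head
W_aux = np.zeros((8, 4))  # auxiliary classification head

lr = 0.1
for step in range(100):
    # Upper level: update the semantic and rotation heads.
    W_sem -= lr * ce_grad(W_sem, X, y_sem)
    W_rot -= lr * ce_grad(W_rot, X, y_rot)

    # Lower level: update only the auxiliary head, driven by the
    # semantic head's outputs while the semantic head stays frozen
    # (no gradient flows into W_sem here).
    pseudo = softmax(X @ W_sem).argmax(axis=1)
    W_aux -= lr * ce_grad(W_aux, X, pseudo)

train_acc = (softmax(X @ W_sem).argmax(axis=1) == y_sem).mean()
```

The key structural point is that each outer step interleaves the two levels: the auxiliary head only ever sees the semantic head as a fixed teacher, so its gradient cannot alter the upper-level parameters.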