Models trained via empirical risk minimization (ERM) are known to rely on spurious correlations between labels and task-independent input features, resulting in poor generalization under distribution shift. Group distributionally robust optimization (G-DRO) can alleviate this problem by minimizing the worst-case loss over a set of pre-defined groups of training data. G-DRO successfully improves performance on the worst group, where the spurious correlation does not hold. However, G-DRO assumes that the spurious correlations and the associated worst groups are known in advance, making it challenging to apply to new tasks with potentially multiple unknown spurious correlations. We propose AGRO -- Adversarial Group discovery for Distributionally Robust Optimization -- an end-to-end approach that jointly identifies error-prone groups and improves accuracy on them. AGRO equips G-DRO with an adversarial slicing model that finds a group assignment over training examples that maximizes the worst-case loss over the discovered groups. On the WILDS benchmark, AGRO yields 8% higher model performance on average on known worst groups than prior group discovery approaches used with G-DRO. AGRO also improves out-of-distribution performance on SST2, QQP, and MS-COCO -- datasets whose potential spurious correlations are as yet uncharacterized. Human evaluation of AGRO groups shows that they contain well-defined, yet previously unstudied, spurious correlations that lead to model errors.
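As a hedged sketch of the objectives described above (the notation here is our own assumption, not taken from the paper): G-DRO minimizes the worst expected loss over a fixed set of groups, while AGRO additionally learns a soft group-assignment (slicing) model that is trained adversarially so that the worst discovered group is as hard as possible for the task model:

\[
\text{G-DRO:}\quad \min_{\theta}\; \max_{g \in \mathcal{G}}\; \mathbb{E}_{(x,y)\sim P_g}\big[\ell(f_\theta(x), y)\big]
\]
\[
\text{AGRO (sketch):}\quad \min_{\theta}\; \max_{\phi}\; \max_{g}\; \frac{\sum_{i} q_\phi(g \mid x_i, y_i)\,\ell\big(f_\theta(x_i), y_i\big)}{\sum_{i} q_\phi(g \mid x_i, y_i)}
\]

Here \(f_\theta\) denotes the task model, \(\ell\) the per-example loss, \(\mathcal{G}\) the pre-defined groups, and \(q_\phi(g \mid x, y)\) the adversarial slicing model's soft assignment of example \((x, y)\) to discovered group \(g\); all symbols are illustrative assumptions rather than the paper's exact formulation.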