In this paper, we present a simple yet effective method (ABSGD) for addressing the data imbalance issue in deep learning. Our method is a simple modification to momentum SGD in which we leverage an attentional mechanism to assign an individual importance weight to each gradient in the mini-batch. Unlike many existing heuristic-driven methods for tackling data imbalance, our method is grounded in {\it theoretically justified distributionally robust optimization (DRO)} and is guaranteed to converge to a stationary point of an information-regularized DRO problem. The individual-level weight of each sampled example is proportional to the exponential of a scaled loss value of that example, where the scaling factor is interpreted as the regularization parameter in the framework of information-regularized DRO. Compared with existing class-level weighting schemes, our method can capture the diversity between individual examples within each class. Compared with existing individual-level weighting methods based on meta-learning, which require three backward propagations to compute mini-batch stochastic gradients, our method is more efficient, requiring only one backward propagation per iteration as in standard deep learning methods. To balance the learning of the feature-extraction layers and the learning of the classifier layer, we employ a two-stage method that uses SGD for pretraining followed by ABSGD for learning a robust classifier and fine-tuning the lower layers. Our empirical studies on several benchmark datasets demonstrate the effectiveness of the proposed method.
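To make the weighting scheme concrete, the following is a minimal NumPy sketch of the attentional per-example weighting described above: each example's weight is proportional to the exponential of its loss scaled by a regularization parameter, and the weighted gradient is fed into a standard momentum-SGD update. All function names, the within-batch normalization, and the hyperparameter values are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def attentional_weights(losses, lam):
    """Per-example weights proportional to exp(loss / lam), normalized
    within the mini-batch (softmax over scaled losses). A smaller lam
    concentrates more weight on high-loss (hard) examples. Hypothetical
    sketch; lam plays the role of the DRO regularization parameter."""
    scaled = np.asarray(losses, dtype=float) / lam
    scaled -= scaled.max()          # shift for numerical stability
    w = np.exp(scaled)
    return w / w.sum()

def absgd_step(params, per_example_grads, losses, velocity,
               lam=1.0, lr=0.1, momentum=0.9):
    """One momentum-SGD step using the attentionally weighted average of
    per-example gradients instead of the uniform mini-batch average.
    per_example_grads has shape [batch_size, num_params]."""
    w = attentional_weights(losses, lam)
    g = (w[:, None] * per_example_grads).sum(axis=0)  # weighted gradient
    velocity = momentum * velocity + g
    return params - lr * velocity, velocity
```

Note that only a single backward pass per iteration is needed to obtain the per-example losses and gradients, which is the source of the efficiency advantage over meta-learning-based reweighting.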