Learning to improve AUC performance is an important topic in machine learning. However, AUC maximization algorithms may suffer degraded generalization performance due to noisy data. Self-paced learning is an effective method for handling noisy data, but existing self-paced learning methods are limited to pointwise learning, while AUC maximization is a pairwise learning problem. To solve this challenging problem, we propose a novel balanced self-paced AUC maximization algorithm (BSPAUC). Specifically, we first provide a statistical objective for self-paced AUC. Based on this, we propose our self-paced AUC maximization formulation, in which a novel balanced self-paced regularization term is embedded to ensure that the selected positive and negative samples have proper proportions. Notably, the sub-problem with respect to all weight variables may be non-convex in our formulation, whereas it is typically convex in existing self-paced problems. To address this, we propose a doubly cyclic block coordinate descent method. More importantly, we prove that the sub-problem with respect to all weight variables converges to a stationary point based on closed-form solutions, and that our BSPAUC converges to a stationary point of our fixed optimization objective under a mild assumption. Covering both deep learning and kernel-based implementations, experimental results on several large-scale datasets demonstrate that our BSPAUC achieves better generalization performance than existing state-of-the-art AUC maximization methods.
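To make the core idea concrete, the following is a minimal illustrative sketch (not the paper's exact formulation) of self-paced sample selection for pairwise AUC maximization: per-sample weights are set in closed form by keeping the same proportion of easiest samples in each class (the paper instead enforces this balance through a regularization term), alternating with a subgradient step on the model over the selected pairs. All names and hyperparameters here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two separable classes plus a few noisy (flipped) points.
X_pos = np.vstack([rng.normal(1.0, 0.5, (18, 2)),
                   rng.normal(-1.0, 0.5, (2, 2))])   # 2 noisy positives
X_neg = np.vstack([rng.normal(-1.0, 0.5, (18, 2)),
                   rng.normal(1.0, 0.5, (2, 2))])    # 2 noisy negatives

w = np.zeros(2)       # linear scoring model
lr, keep = 0.1, 0.8   # step size; fraction of easiest samples to keep

for it in range(50):
    s_pos, s_neg = X_pos @ w, X_neg @ w
    # Pairwise hinge losses over all positive-negative pairs:
    # max(0, 1 - (s_i - s_j)) for positive i and negative j.
    losses = np.maximum(0.0, 1.0 - (s_pos[:, None] - s_neg[None, :]))

    # Self-paced step (closed form): rank samples by their average
    # pairwise loss and keep the SAME proportion of easiest samples
    # in each class -- the "balanced" idea.
    pos_loss, neg_loss = losses.mean(axis=1), losses.mean(axis=0)
    k_pos = max(1, int(keep * len(pos_loss)))
    k_neg = max(1, int(keep * len(neg_loss)))
    v_pos = pos_loss <= np.partition(pos_loss, k_pos - 1)[k_pos - 1]
    v_neg = neg_loss <= np.partition(neg_loss, k_neg - 1)[k_neg - 1]

    # Model step: hinge subgradient over the selected active pairs only.
    active = (losses > 0) & v_pos[:, None] & v_neg[None, :]
    grad = (X_neg[None, :, :] - X_pos[:, None, :]) * active[:, :, None]
    w -= lr * grad.sum(axis=(0, 1)) / max(1, active.sum())
```

Because the noisy points incur the largest pairwise losses, they tend to be excluded from both classes in equal proportion, so the model step fits mostly clean pairs; this mirrors, in a simplified form, the alternation between weight-variable and model-variable blocks described above.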