Partial label learning (PLL) is a typical weakly supervised learning problem, where each training example is associated with a set of candidate labels among which only one is true. Most existing PLL approaches assume that the incorrect labels in each training example are picked as candidate labels at random. However, this assumption is unrealistic, since the candidate labels are always instance-dependent. In this paper, we consider instance-dependent PLL and assume that each example is associated with a latent label distribution composed of a real-valued degree for each label, representing how well that label describes the instance. An incorrect label with a high degree is more likely to be annotated as a candidate label. The latent label distribution is therefore the essential labeling information in partially labeled examples and is worth leveraging for predictive model training. Motivated by this consideration, we propose a novel PLL method that recovers the label distribution as a label enhancement (LE) process and trains the predictive model iteratively in every epoch. Specifically, we approximate the true posterior density of the latent label distribution with a variational Dirichlet density parameterized by an inference model. We then derive the evidence lower bound for optimizing the inference model, and the label distributions generated from the variational posterior are used to train the predictive model. Experiments on benchmark and real-world datasets validate the effectiveness of the proposed method. Source code is available at https://github.com/palm-ml/valen.
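To make the variational step concrete, below is a minimal PyTorch sketch of the idea described above: an inference model outputs Dirichlet concentration parameters, the evidence lower bound (likelihood of the candidate labels minus a KL term against a uniform Dirichlet prior) is maximized, and label distributions sampled from the variational posterior supervise the classifier. All names (`InferenceNet`-style modules, `candidate_mask`, `training_step`) and the simple likelihood term are illustrative assumptions, not the authors' implementation; see the linked VALEN repository for the actual method.

```python
import torch
import torch.nn as nn
from torch.distributions import Dirichlet, kl_divergence

num_features, num_classes = 64, 10

# Inference model: maps an instance to Dirichlet concentration parameters.
inference_net = nn.Sequential(
    nn.Linear(num_features, 128), nn.ReLU(),
    nn.Linear(128, num_classes), nn.Softplus(),  # keep alpha > 0
)
# Predictive model trained on the recovered label distributions.
classifier = nn.Linear(num_features, num_classes)

prior = Dirichlet(torch.ones(num_classes))  # uniform Dirichlet prior p(d)

def training_step(x, candidate_mask, optimizer):
    """One iteration: maximize the ELBO, then fit the classifier
    to label distributions sampled from the variational posterior."""
    alpha = inference_net(x) + 1e-6        # concentration parameters
    q = Dirichlet(alpha)                   # variational posterior q(d | x)
    d = q.rsample()                        # reparameterized sample

    # Simplified likelihood term: candidate labels should receive high mass.
    log_lik = torch.log((d * candidate_mask).sum(dim=1) + 1e-8).mean()
    kl = kl_divergence(q, prior).mean()
    elbo = log_lik - kl

    # Train the classifier against the recovered distribution (soft targets).
    log_probs = torch.log_softmax(classifier(x), dim=1)
    cls_loss = -(d.detach() * log_probs).sum(dim=1).mean()

    loss = -elbo + cls_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage with random data and random candidate sets:
x = torch.randn(32, num_features)
candidate_mask = (torch.rand(32, num_classes) < 0.3).float()
candidate_mask[torch.arange(32), torch.randint(num_classes, (32,))] = 1.0
params = list(inference_net.parameters()) + list(classifier.parameters())
opt = torch.optim.Adam(params, lr=1e-3)
training_step(x, candidate_mask, opt)
```

In this sketch the Dirichlet sample is detached before the classification loss, so the ELBO alone shapes the inference model; the paper's actual objective and iterative schedule may differ.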