Curation of large fully supervised datasets has become one of the major roadblocks for machine learning. Weak supervision provides an alternative to supervised learning by training with cheap, noisy, and possibly correlated labeling functions from varying sources. The key challenge in weakly supervised learning is combining the different weak supervision signals while navigating misleading correlations in their errors. In this paper, we propose a simple data-free approach for combining weak supervision signals by defining a constrained space for the possible labels of the weak signals and training with a random labeling within this constrained space. Our method is efficient and stable, converging after a few iterations of gradient descent. We prove theoretical conditions under which the worst-case error of the randomized label decreases with the rank of the linear constraints. We show experimentally that our method outperforms other weak supervision methods on various text- and image-classification tasks.
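The idea in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes weak signals are soft labels in [0, 1] with known error-rate bounds, encodes the constrained label space as linear error constraints, and uses projected gradient descent on the squared constraint violations to move a random labeling into that space. All names and the synthetic data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n examples, m weak signals giving soft labels in [0, 1].
n, m = 100, 3
true_y = rng.integers(0, 2, size=n).astype(float)
# Synthetic weak signals: noisy copies of the true labels (illustration only).
weak = np.clip(true_y[None, :] + rng.normal(0.0, 0.3, size=(m, n)), 0.0, 1.0)
# Assumed error-rate bound for each weak signal (a labeled input in practice).
bounds = np.full(m, 0.2)

def errors(y, weak):
    # Expected disagreement between a candidate labeling y and each weak signal;
    # linear in y, so the bounds below define linear constraints on y.
    return ((1.0 - weak) * y + weak * (1.0 - y)).mean(axis=1)

# Start from a random labeling and run projected gradient descent to push it
# into the constrained space {y : errors(y, weak)_i <= bounds_i for all i}.
y = rng.uniform(0.0, 1.0, size=n)
lr = 20.0
for _ in range(500):
    viol = np.maximum(errors(y, weak) - bounds, 0.0)  # active violations only
    grad = (viol[:, None] * (1.0 - 2.0 * weak)).sum(axis=0) / n
    y = np.clip(y - lr * grad, 0.0, 1.0)              # project onto [0, 1]^n
```

Because each error term is linear in `y`, the squared-violation objective is convex, which is consistent with the abstract's claim of stable convergence after a few gradient steps; the resulting `y` is one random labeling from the constrained space, usable as training targets.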