We study the problem of learning neural text classifiers without any labeled data, using only easy-to-provide rules as multiple weak supervision sources. This problem is challenging because rule-induced weak labels are often noisy and incomplete. To address these two challenges, we design a label denoiser, which estimates source reliability using a conditional soft attention mechanism and then reduces label noise by aggregating rule-annotated weak labels. The denoised pseudo labels then supervise a neural classifier to predict soft labels for unmatched samples, which addresses the rule coverage issue. We evaluate our model on five benchmarks for sentiment, topic, and relation classification. The results show that our model consistently outperforms state-of-the-art weakly supervised and semi-supervised methods, and achieves performance comparable to fully supervised methods even without any labeled data. Our code can be found at https://github.com/weakrules/Denoise-multi-weak-sources.
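The core denoising step described above, attention weights over rules followed by weighted aggregation of their weak labels, can be sketched as follows. This is a minimal illustration of the idea, not the paper's implementation: the function name, array shapes, and the use of fixed reliability logits (rather than a learned conditional attention network) are all simplifying assumptions.

```python
import numpy as np

def denoise_weak_labels(weak_labels, reliability_logits):
    """Aggregate per-rule weak labels into one soft pseudo label.

    weak_labels: (num_rules, num_classes) one-hot votes; a rule that
    does not match the sample contributes an all-zero row.
    reliability_logits: (num_rules,) unnormalized source-reliability
    scores (in the paper, produced by conditional soft attention;
    here just given as fixed numbers for illustration).
    """
    # Softmax over rules: higher reliability -> larger aggregation weight.
    attn = np.exp(reliability_logits - reliability_logits.max())
    attn /= attn.sum()
    # Reliability-weighted vote over classes.
    soft = attn @ weak_labels
    total = soft.sum()
    if total == 0.0:  # no rule matched: fall back to a uniform label
        return np.full(weak_labels.shape[1], 1.0 / weak_labels.shape[1])
    return soft / total  # renormalize into a soft (probability) label

# Example: three rules voting over two classes; rule 3 abstains.
votes = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])
logits = np.array([2.0, 0.5, 0.0])
pseudo = denoise_weak_labels(votes, logits)
```

In this example the first rule's higher reliability logit pulls the aggregated soft label toward class 0, which is exactly how estimated source reliability reduces the influence of noisy rules.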