We show that BERT (Devlin et al., 2018) is a Markov random field language model. Formulating BERT in this way gives rise to a natural procedure for sampling sentences from BERT. We sample sentences from BERT and find that it can produce high-quality, fluent generations. Compared to the generations of a traditional left-to-right language model, BERT generates sentences that are more diverse but of slightly worse quality.
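The sampling procedure alluded to above can be sketched as a Gibbs sampler: repeatedly pick a position, mask it, and resample that token from the model's conditional distribution given the rest of the sentence. The sketch below is a minimal illustration of that loop; `masked_lm_probs` is a hypothetical stand-in (here a toy uniform distribution) for querying BERT's masked-LM head, and the vocabulary and hyperparameters are invented for the example.

```python
import random

# Toy vocabulary; a real sampler would use BERT's wordpiece vocabulary.
VOCAB = ["the", "cat", "dog", "sat", "ran", "quickly"]

def masked_lm_probs(tokens, pos):
    # Toy stand-in for BERT's masked-LM conditional p(x_pos | x_{-pos}):
    # uniform over the vocabulary. A real implementation would replace
    # tokens[pos] with [MASK] and query BERT for its output distribution.
    return {w: 1.0 / len(VOCAB) for w in VOCAB}

def gibbs_sample(length, n_iters, seed=0):
    rng = random.Random(seed)
    # Start from an arbitrary (here: random) sequence.
    tokens = [rng.choice(VOCAB) for _ in range(length)]
    for _ in range(n_iters):
        pos = rng.randrange(length)           # pick a position to resample
        probs = masked_lm_probs(tokens, pos)  # conditional given the rest
        words = list(probs)
        weights = [probs[w] for w in words]
        tokens[pos] = rng.choices(words, weights=weights)[0]
    return tokens

sentence = gibbs_sample(length=5, n_iters=50)
print(" ".join(sentence))
```

After enough iterations, the sequence is (approximately) a sample from the joint distribution implied by the model's conditionals; with BERT in place of the toy model, this yields the sentence samples the abstract refers to.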