Most combinations of NLP tasks and language varieties lack in-domain examples for supervised training because of the paucity of annotated data. How can neural models make sample-efficient generalizations from task-language combinations with available data to low-resource ones? In this work, we propose a Bayesian generative model for the space of neural parameters. We assume that this space can be factorized into latent variables for each language and each task. We infer the posteriors over such latent variables from data for seen task-language combinations through variational inference. This enables zero-shot classification on unseen combinations at prediction time. For instance, given training data for named entity recognition (NER) in Vietnamese and for part-of-speech (POS) tagging in Wolof, our model can perform accurate predictions for NER in Wolof. In particular, we experiment with a typologically diverse sample of 33 languages from 4 continents and 11 families, and show that our model yields results comparable to or better than state-of-the-art zero-shot cross-lingual transfer methods. Moreover, we demonstrate that approximate Bayesian model averaging results in smoother predictive distributions, whose entropy inversely correlates with accuracy. Hence, the proposed framework also offers robust estimates of prediction uncertainty. Our code is available at github.com/cambridgeltl/parameter-factorization.
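As a concrete illustration of the idea, the PyTorch sketch below is a minimal simplification, not the released implementation: all names (`LatentFactor`, `hyper_net`), the dimensions, the shared label space across tasks, and the toy data are assumptions made for exposition. It places a diagonal-Gaussian posterior over one latent vector per task and per language, generates classifier weights from the concatenated factors via a hypernetwork, fits the posteriors on seen task-language pairs by maximizing an ELBO, and then recombines the NER factor with the Wolof factor for zero-shot prediction, averaging the predictive distribution over posterior samples.

```python
# Minimal sketch of parameter-space factorization (illustrative, not the
# authors' released code). Classifier weights for a (task, language) pair
# are generated from per-task and per-language Gaussian latent variables,
# whose posteriors are fit by variational inference on seen combinations.
import torch
import torch.nn as nn

LATENT_DIM, FEAT_DIM, NUM_CLASSES = 16, 32, 5  # shared label space: a toy simplification

class LatentFactor(nn.Module):
    """Diagonal-Gaussian posterior q(z) for one task or one language."""
    def __init__(self, dim):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(dim))
        self.log_var = nn.Parameter(torch.zeros(dim))

    def sample(self):
        # Reparameterization trick: z = mu + sigma * eps.
        eps = torch.randn_like(self.mu)
        return self.mu + torch.exp(0.5 * self.log_var) * eps

    def kl(self):
        # KL(q(z) || N(0, I)) in closed form.
        return 0.5 * torch.sum(self.mu ** 2 + self.log_var.exp() - 1.0 - self.log_var)

tasks = {t: LatentFactor(LATENT_DIM) for t in ["ner", "pos"]}
langs = {l: LatentFactor(LATENT_DIM) for l in ["vi", "wo"]}

# Hypernetwork: maps [z_task; z_lang] to flat classifier weights and biases.
hyper_net = nn.Linear(2 * LATENT_DIM, FEAT_DIM * NUM_CLASSES + NUM_CLASSES)

def classify(features, task, lang):
    z = torch.cat([tasks[task].sample(), langs[lang].sample()])
    flat = hyper_net(z)
    W = flat[: FEAT_DIM * NUM_CLASSES].view(NUM_CLASSES, FEAT_DIM)
    b = flat[FEAT_DIM * NUM_CLASSES:]
    return features @ W.T + b  # logits

# Toy batches for the two seen pairs from the abstract: (NER, Vietnamese)
# and (POS, Wolof). Real features would come from a multilingual encoder.
data = {("ner", "vi"): (torch.randn(8, FEAT_DIM), torch.randint(0, NUM_CLASSES, (8,))),
        ("pos", "wo"): (torch.randn(8, FEAT_DIM), torch.randint(0, NUM_CLASSES, (8,)))}

params = list(hyper_net.parameters()) + [
    p for m in (*tasks.values(), *langs.values()) for p in m.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)
for _ in range(100):
    opt.zero_grad()
    nll = sum(nn.functional.cross_entropy(classify(x, t, l), y)
              for (t, l), (x, y) in data.items())
    kl = sum(m.kl() for m in (*tasks.values(), *langs.values()))
    (nll + 1e-3 * kl).backward()  # negative ELBO with a small KL weight
    opt.step()

# Zero-shot prediction: recombine the NER task factor with the Wolof language
# factor, with approximate Bayesian model averaging over posterior samples.
x_new = torch.randn(1, FEAT_DIM)
probs = torch.stack(
    [classify(x_new, "ner", "wo").softmax(-1) for _ in range(10)]).mean(0)
entropy = -(probs * probs.log()).sum(-1)  # low entropy ~ high confidence
```

Averaging the softmax over several posterior samples is what smooths the predictive distribution; the entropy of the averaged distribution can then be read as the uncertainty estimate the abstract refers to.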