In this work we present a Mixture of Task-Aware Experts Network for Machine Reading Comprehension on a relatively small dataset. We focus in particular on the problem of common-sense learning, enforcing common-ground knowledge by training different expert networks to capture different kinds of relationships between each passage, question, and choice triplet. Moreover, we take inspiration from recent advances in multitask and transfer learning by training each network on a relevant, focused task. By making the mixture of networks aware of a specific goal through an enforced task and relationship, we achieve state-of-the-art results and reduce over-fitting.
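To make the idea concrete, below is a minimal sketch, not the authors' implementation, of a mixture of task-aware experts for multiple-choice reading comprehension. It assumes each (passage, question, choice) triplet has already been encoded into a fixed-size vector; the class names (`TaskAwareExpert`, `MixtureOfTaskAwareExperts`), the dimensions, and the soft gating over per-expert scores are all illustrative assumptions, since the abstract does not specify the architecture details.

```python
# Minimal sketch (assumed, not the paper's code): each expert scores a
# (passage, question, choice) triplet and a gating network mixes the scores.
import torch
import torch.nn as nn


class TaskAwareExpert(nn.Module):
    """One expert network; in the paper each expert is trained on its own focused task."""

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),  # scalar score for a single answer choice
        )

    def forward(self, triplet_repr: torch.Tensor) -> torch.Tensor:
        return self.net(triplet_repr)


class MixtureOfTaskAwareExperts(nn.Module):
    """Gating network mixes per-expert scores for each encoded triplet."""

    def __init__(self, input_dim: int, hidden_dim: int, num_experts: int):
        super().__init__()
        self.experts = nn.ModuleList(
            TaskAwareExpert(input_dim, hidden_dim) for _ in range(num_experts)
        )
        self.gate = nn.Linear(input_dim, num_experts)

    def forward(self, triplet_repr: torch.Tensor) -> torch.Tensor:
        # triplet_repr: (batch, num_choices, input_dim)
        gate_weights = torch.softmax(self.gate(triplet_repr), dim=-1)   # (B, C, E)
        expert_scores = torch.cat(
            [expert(triplet_repr) for expert in self.experts], dim=-1   # (B, C, E)
        )
        scores = (gate_weights * expert_scores).sum(dim=-1)             # (B, C)
        return scores  # softmax over choices gives the answer distribution


# Usage with dummy encodings: 2 passages, 4 answer choices, 128-dim triplet vectors.
model = MixtureOfTaskAwareExperts(input_dim=128, hidden_dim=64, num_experts=3)
logits = model(torch.randn(2, 4, 128))
print(logits.shape)  # torch.Size([2, 4])
```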