Transfer learning, particularly approaches that combine multi-task learning with pre-trained contextualized embeddings and fine-tuning, has advanced the field of Natural Language Processing tremendously in recent years. In this paper we present MaChAmp, a toolkit for easy fine-tuning of contextualized embeddings in multi-task settings. The benefits of MaChAmp are its flexible configuration options and its support for a variety of natural language processing tasks in a uniform toolkit, from text classification and sequence labeling to dependency parsing, masked language modeling, and text generation.
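To make the configuration options concrete, the following is a minimal sketch of a MaChAmp dataset configuration for jointly training a POS tagger and a dependency parser on one dataset, loosely following the JSON format shown in the MaChAmp documentation; the dataset name, file paths, and column indices are illustrative placeholders, and the exact field names should be checked against the toolkit's own example configurations.

    {
        "UD_EWT": {
            "train_data_path": "data/ewt.train.conllu",
            "validation_data_path": "data/ewt.dev.conllu",
            "word_idx": 1,
            "tasks": {
                "upos":       {"task_type": "seq",        "column_idx": 3},
                "dependency": {"task_type": "dependency", "column_idx": 6}
            }
        }
    }

Each top-level key names a dataset, and each entry under "tasks" defines one task whose loss is optimized jointly over that dataset's columns; training is then launched by pointing the toolkit's train.py script at this file.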