The open-access dissemination of pretrained language models through online repositories has led to a democratization of state-of-the-art natural language processing (NLP) research. It also allows people outside of NLP to use such models and adapt them to specific use cases. However, a certain amount of technical proficiency is still required, which constitutes an entry barrier for users who want to apply these models to a given task but lack the necessary knowledge or resources. In this work, we aim to overcome this gap by providing a tool that allows researchers to leverage pretrained models without writing a single line of code. Built upon parameter-efficient adapter modules for transfer learning, our AdapterHub Playground provides an intuitive interface that enables the use of adapters for prediction, training, and analysis of textual data across a variety of NLP tasks. We present the tool's architecture and demonstrate its advantages with prototypical use cases, showing that predictive performance can easily be increased in a few-shot learning scenario. Finally, we evaluate its usability in a user study. We provide the code and a live interface at https://adapter-hub.github.io/playground.