When a human communicates with a machine online using natural language, how can the machine understand the human's intention and the semantic context of what they say? This is an important AI task, as it enables the machine to construct a sensible answer or perform a useful action for the human. Meaning is represented at the sentence level, the identification of which is known as intent detection, and at the word level, a labelling task called slot filling. This dual-level joint task requires innovative thinking about natural language and deep learning network design, and as a result many approaches and models have been proposed and applied. This tutorial will discuss how the joint task is set up and introduce Spoken Language Understanding / Natural Language Understanding (SLU/NLU) with deep learning techniques. We will cover the datasets, experiments, and metrics used in the field, and describe how the machine applies the latest NLP and deep learning techniques to the joint task, including recurrent and attention-based Transformer networks and pre-trained models (e.g. BERT). We will then look in detail at a network that explicitly allows the two levels of the task, intent classification and slot filling, to interact and thereby boost performance. Finally, we will demonstrate this model in a Python notebook, giving attendees an opportunity to follow coding demos of this joint NLU task and deepen their understanding.
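To make the joint task concrete, the following minimal sketch shows what its inputs and outputs look like: one sentence-level intent label and one BIO slot tag per token. The utterance, intent name, and slot names are illustrative examples in the style of the ATIS benchmark, not a definitive specification of any particular dataset or model.

```python
# Illustrative joint-NLU example (ATIS-style labels; names are assumptions).
utterance = "show me flights from boston to denver".split()

# Sentence-level output: intent detection is a classification task.
intent = "atis_flight"

# Word-level output: slot filling assigns one BIO tag per token.
slots = ["O", "O", "O", "O", "B-fromloc.city_name", "O", "B-toloc.city_name"]
assert len(slots) == len(utterance)  # the two sequences must align

def extract_slots(tokens, tags):
    """Collect (slot_name, value) pairs from BIO-tagged tokens."""
    spans, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):          # begin a new slot span
            current = [tag[2:], [token]]
            spans.append(current)
        elif tag.startswith("I-") and current:  # continue the current span
            current[1].append(token)
        else:                             # "O" tag: outside any slot
            current = None
    return [(name, " ".join(words)) for name, words in spans]

print(intent)                        # -> atis_flight
print(extract_slots(utterance, slots))
# -> [('fromloc.city_name', 'boston'), ('toloc.city_name', 'denver')]
```

A joint model predicts both outputs from the same encoder, which is why letting the two levels interact (as the network covered later in the tutorial does) can boost performance on both.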