These lecture notes focus on recent advancements in neural information retrieval, with particular emphasis on systems and models exploiting transformer networks. These networks, originally proposed by Google researchers in 2017, have seen great success in many natural language processing and information retrieval tasks. While there are many excellent textbooks on information retrieval and natural language processing, as well as specialised books for a more advanced audience, these lecture notes target readers aiming to develop a basic understanding of the main information retrieval techniques and approaches based on deep learning. These notes have been prepared for an information retrieval graduate course of the MSc program in Artificial Intelligence and Data Engineering at the University of Pisa, Italy.