Modern recommender systems (RS) work by processing signals that can be inferred from large sets of user-item interaction data. The main signal stems from the raw matrix of user-item interactions. However, the performance of RS can be improved by considering other kinds of signals, such as the context of the interactions: for example, the time or date of the interaction, the user location, or sequential data corresponding to the user's historical interactions with the system. These complex, context-based interaction signals are characterized by a rich relational structure that can be represented by a multi-partite graph. Graph Convolutional Networks (GCNs) have been used successfully in collaborative filtering with simple user-item interaction data. In this work, we generalize GCNs to N-partite graphs by considering multiple context dimensions, and we propose a simple way to integrate them seamlessly into modern deep learning RS architectures. More specifically, we define a graph convolutional embedding layer for N-partite graphs that processes user-item-context interactions and constructs node embeddings by leveraging their relational structure. Experiments on several datasets, ranging from recommender systems to drug repurposing, show the benefits of the introduced GCN embedding layer by measuring performance on different context-enriched tasks.
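To make the idea of a graph convolutional embedding layer over an N-partite interaction graph concrete, the following is a minimal sketch, not the paper's actual layer: the class name `NPartiteGCNEmbedding`, the shared embedding table over all node types (users, items, and each context dimension), and the LightGCN-style mean aggregation are assumptions made for illustration. Edges are assumed to connect every pair of nodes that co-occur in an interaction tuple (user, item, context_1, ..., context_{N-2}).

```python
# Minimal, hypothetical sketch of a graph convolutional embedding layer
# over an N-partite interaction graph (not the authors' implementation).
import torch
import torch.nn as nn


class NPartiteGCNEmbedding(nn.Module):
    def __init__(self, num_nodes: int, dim: int, num_layers: int = 2):
        super().__init__()
        # One embedding table indexed by a global node id shared by all
        # node types (users, items, context values).
        self.embedding = nn.Embedding(num_nodes, dim)
        nn.init.normal_(self.embedding.weight, std=0.1)
        self.num_layers = num_layers

    def forward(self, edge_index: torch.Tensor) -> torch.Tensor:
        # edge_index: (2, E) tensor of symmetric edges over all node types.
        num_nodes = self.embedding.num_embeddings
        row, col = edge_index
        # Symmetric degree normalization, as in standard GCN propagation.
        deg = torch.zeros(num_nodes).scatter_add_(
            0, row, torch.ones_like(row, dtype=torch.float)
        )
        norm = (deg[row].clamp(min=1) * deg[col].clamp(min=1)).rsqrt()

        x = self.embedding.weight
        layers = [x]
        for _ in range(self.num_layers):
            # Aggregate neighbor embeddings: x_i <- sum_j norm_ij * x_j
            msg = norm.unsqueeze(-1) * x[col]
            x = torch.zeros_like(x).index_add_(0, row, msg)
            layers.append(x)
        # Final embedding: average over propagation depths.
        return torch.stack(layers, dim=0).mean(dim=0)
```

Under these assumptions, the returned node embeddings could be looked up by user, item, and context ids and fed into any downstream deep learning RS architecture in place of plain embedding-table lookups.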