The cold-start problem is a fundamental challenge for recommendation tasks. Although recent advances in Graph Neural Networks (GNNs) incorporate high-order collaborative signals to alleviate the problem, the embeddings of cold-start users and items are not explicitly optimized, and cold-start neighbors are not handled during graph convolution. This paper proposes to pre-train a GNN model before applying it to recommendation. Unlike the recommendation objective, the pre-training GNN simulates cold-start scenarios from users/items with abundant interactions and takes embedding reconstruction as the pretext task, so that it directly improves embedding quality and can be easily adapted to new cold-start users/items. To further reduce the impact of cold-start neighbors, we incorporate a self-attention-based meta aggregator to enhance the aggregation ability at each graph convolution step, and an adaptive neighbor sampler to select effective neighbors according to feedback from the pre-training GNN model. Experiments on three public recommendation datasets show the superiority of our pre-trained GNN model over the original GNN models on both user/item embedding inference and the recommendation task.
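To make the pretext task concrete, the following is a minimal PyTorch sketch of embedding reconstruction under a simulated cold-start: most of a node's neighbors are masked out, a GNN layer infers an embedding from the few that remain, and the loss measures how far that inferred embedding is from a target embedding learned from the full interaction history. The layer definition, the `keep` ratio, and names such as `pretext_loss` and `target_emb` are illustrative assumptions, not the paper's exact interface.

```python
# A minimal sketch of the embedding-reconstruction pretext task (assumed
# setup; a generic mean-pooling layer stands in for the paper's GNN).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MeanGNNLayer(nn.Module):
    """One mean-pooling graph convolution step (stand-in for any GNN layer)."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(2 * dim, dim)

    def forward(self, self_emb, neigh_emb):
        # neigh_emb: (batch, n_neighbors, dim) -> mean-pool, combine with self.
        pooled = neigh_emb.mean(dim=1)
        return torch.tanh(self.lin(torch.cat([self_emb, pooled], dim=-1)))

def pretext_loss(layer, self_emb, neigh_emb, target_emb, keep=2):
    """Simulate cold start: keep only `keep` neighbors, then reconstruct the
    ground-truth embedding learned from the full interaction history."""
    few = neigh_emb[:, :keep, :]          # masked, cold-start-like neighborhood
    predicted = layer(self_emb, few)      # embedding inferred from few neighbors
    # Cosine-distance reconstruction loss against the target embedding.
    return (1.0 - F.cosine_similarity(predicted, target_emb, dim=-1)).mean()

# Toy usage: 4 users, 16-dim embeddings, 10 observed neighbors each.
dim = 16
layer = MeanGNNLayer(dim)
self_emb = torch.randn(4, dim)
neigh_emb = torch.randn(4, 10, dim)
target_emb = torch.randn(4, dim)          # e.g., embeddings trained on full data
loss = pretext_loss(layer, self_emb, neigh_emb, target_emb)
loss.backward()
```

Because the loss is defined directly on embeddings rather than on click labels, optimizing it explicitly improves embedding quality for sparsely connected nodes, which is the property the recommendation objective alone does not guarantee.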
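The meta aggregator can likewise be sketched briefly. The sketch below, under the assumption that the aggregator applies self-attention over the first-order neighbors' embeddings and pools the attended outputs into one enhanced representation, uses the standard `nn.MultiheadAttention` module; the class name and pooling choice are illustrative.

```python
# A minimal sketch of a self-attention-based meta aggregator (assumed design).
import torch
import torch.nn as nn

class MetaAggregator(nn.Module):
    def __init__(self, dim, heads=2):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, neigh_emb):
        # neigh_emb: (batch, n_neighbors, dim); self-attention lets each
        # neighbor's representation be refined by the others before pooling.
        attended, _ = self.attn(neigh_emb, neigh_emb, neigh_emb)
        return attended.mean(dim=1)       # (batch, dim) enhanced meta embedding

agg = MetaAggregator(dim=16)
meta = agg(torch.randn(4, 10, 16))        # one enhanced embedding per target node
```

Feeding this meta embedding into each graph convolution step gives cold-start nodes a stronger aggregated signal even when their own neighborhoods are sparse or noisy.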