Recommender systems suffer from the cold-start problem whenever a new user joins the platform or a new item is added to the catalog. To address item cold-start, we propose to replace the embedding layer in sequential recommenders with a dynamic storage that has no learnable weights and can hold an arbitrary number of representations. In this paper, we present FELRec, a large embedding network that refines the existing representations of users and items in a recursive manner as new information becomes available. In contrast to similar approaches, our model represents new users and items without side information or time-consuming fine-tuning. During item cold-start, our method outperforms similar methods by 29.50%-47.45%. Further, our proposed model generalizes well to previously unseen datasets. The source code is publicly available at github.com/kweimann/FELRec.
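To make the central idea concrete, the following is a minimal sketch of a parameter-free dynamic storage that stands in for a fixed embedding table. It is not the authors' implementation: the class name, dimensions, and the refinement step are illustrative assumptions only.

```python
# Minimal sketch (not FELRec's actual code): a dynamic storage without
# learnable weights that can grow to hold any number of representations.
import torch

class DynamicStorage:
    def __init__(self, dim: int):
        self.dim = dim
        self.table = {}  # id -> representation; grows as new users/items appear

    def lookup(self, key) -> torch.Tensor:
        # Unseen keys receive a default (zero) vector, so cold-start users
        # and items can be represented immediately, without fine-tuning.
        return self.table.get(key, torch.zeros(self.dim))

    def update(self, key, representation: torch.Tensor) -> None:
        # Overwrite the stored vector with the refined representation
        # produced by the model as new interactions arrive.
        self.table[key] = representation.detach()

# Usage: recursively refine an item representation after each interaction.
storage = DynamicStorage(dim=64)
item_repr = storage.lookup("item_42")         # zeros for a cold-start item
new_repr = item_repr + 0.1 * torch.randn(64)  # stand-in for the model's refinement step
storage.update("item_42", new_repr)
```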