Using reviews to learn user and item representations is important for recommender systems. Current review-based methods can be divided into two categories: (1) Convolutional Neural Network (CNN) based models that extract n-gram features from user/item reviews; (2) Recurrent Neural Network (RNN) based models that learn global contextual representations of reviews for users and items. Despite their success, both CNN and RNN based models in previous studies suffer from their own drawbacks. CNN based models are weak at modeling long-range dependencies in text, while RNN based models are slow in training and inference because they cannot be parallelized. To alleviate these problems, we propose a new text encoder module for review modeling in recommendation that combines convolution networks with self-attention networks to jointly model local and global interactions in text. As different words, sentences, and reviews have different importance for modeling user and item representations, we construct review models hierarchically at the sentence level, review level, and user/item level, by encoding words into sentences, sentences into reviews, and reviews into user and item representations. Experiments on the Amazon Product Benchmark show that our model achieves significantly better performance compared to state-of-the-art review-based recommendation models.
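The core idea of the proposed encoder, convolution for local n-gram patterns followed by self-attention for global interactions, can be sketched as follows. This is a minimal illustrative implementation in plain numpy, not the authors' actual architecture; the function names, single-head attention, and same-padding convolution are simplifying assumptions.

```python
import numpy as np

def conv1d(x, W, b):
    # x: (seq_len, d_in); W: (k, d_in, d_out) kernel with same-padding.
    # Each output position mixes a local k-gram window, as in CNN text encoders.
    k, d_in, d_out = W.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    out = np.zeros((x.shape[0], d_out))
    for t in range(x.shape[0]):
        window = xp[t:t + k]                        # local n-gram window
        out[t] = np.tensordot(window, W, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)                     # ReLU

def self_attention(x, Wq, Wk, Wv):
    # Scaled dot-product self-attention: every position attends to all
    # others, capturing long-range dependencies the convolution misses.
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # row-wise softmax
    return weights @ V

def encode(x, params):
    # Local features first, then global mixing over the whole sequence.
    local = conv1d(x, params["Wc"], params["bc"])
    return self_attention(local, params["Wq"], params["Wk"], params["Wv"])
```

In the hierarchical setting described above, such an encoder would be applied at each level: over word embeddings to form sentence vectors, over sentence vectors to form review vectors, and over review vectors to form user/item representations.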