Many Graph Neural Networks (GNNs) have been proposed for Knowledge Graph Embedding (KGE). However, many of these methods neglect the importance of relation information and combine it with entity information inefficiently, leading to low expressiveness. To address this issue, we introduce a general knowledge graph encoder that incorporates tensor decomposition into the aggregation function of the Relational Graph Convolutional Network (R-GCN). In our model, neighbor entities are transformed by projection matrices drawn from a low-rank core tensor and indexed by relation type, which benefits from multi-task learning and produces expressive, relation-aware representations. In addition, we propose a low-rank estimation of the core tensor via CP decomposition to compress and regularize the model. We adopt a training strategy inspired by contrastive learning, which alleviates the scalability limitation of 1-N scoring on large graphs. Our model achieves competitive results on FB15k-237 and WN18RR with embeddings of comparatively lower dimensionality.
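The sketch below is a minimal, hypothetical illustration (not the authors' released code) of the idea described above: relation-specific projection matrices obtained from a rank-R CP factorization of a core tensor, applied to neighbor embeddings inside an R-GCN-style message step. The class name `CPRelationalProjection` and all hyperparameter names are illustrative assumptions.

```python
# Minimal sketch, assuming a PyTorch setting; not the paper's official implementation.
import torch
import torch.nn as nn


class CPRelationalProjection(nn.Module):
    """Relation-specific projection W_r = sum_k lambda_{r,k} * u_k v_k^T,
    i.e. a rank-`rank` CP factorization of the (num_rel x d_out x d_in) core tensor,
    so relations share the factor matrices (multi-task effect) while keeping
    per-relation mixing weights lambda_{r,k}."""

    def __init__(self, num_rel: int, d_in: int, d_out: int, rank: int):
        super().__init__()
        self.rel_factors = nn.Parameter(torch.randn(num_rel, rank) * 0.1)  # lambda_{r,k}
        self.out_factors = nn.Parameter(torch.randn(rank, d_out) * 0.1)    # u_k
        self.in_factors = nn.Parameter(torch.randn(rank, d_in) * 0.1)      # v_k

    def forward(self, h_neighbors: torch.Tensor, rel_ids: torch.Tensor) -> torch.Tensor:
        # h_neighbors: (E, d_in) neighbor entity embeddings, one row per edge
        # rel_ids:     (E,) relation type of each edge
        proj = h_neighbors @ self.in_factors.t()       # (E, rank): v_k^T h for each k
        weighted = proj * self.rel_factors[rel_ids]    # (E, rank): scale by lambda_{r,k}
        return weighted @ self.out_factors             # (E, d_out): expand along u_k


# Usage sketch: project neighbor messages before the usual R-GCN-style aggregation
# (e.g. a scatter-mean over each target entity's incoming edges).
layer = CPRelationalProjection(num_rel=237, d_in=100, d_out=100, rank=32)
h = torch.randn(5, 100)                      # 5 neighbor embeddings
r = torch.tensor([0, 3, 3, 7, 1])            # relation type per edge
messages = layer(h, r)                       # (5, 100) relation-aware messages
```

Because the factor matrices are shared across all relations and only the small coefficient vectors are relation-specific, the parameter count grows with the CP rank rather than with `num_rel * d_in * d_out`, which is the compression and regularization effect the abstract refers to.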