Many Graph Neural Networks (GNNs) have been proposed for knowledge graph (KG) embedding. However, many of these methods neglect the importance of relation information and combine it with entity information inefficiently, often merely additively, leading to low expressiveness. To address this issue, we introduce a general knowledge graph encoder that incorporates tensor decomposition into the aggregation function of the Relational Graph Convolutional Network (R-GCN). In our model, the parameters of a low-rank core projection tensor, used to transform neighbor entities, are shared across relations to benefit from multi-task learning and to produce expressive relation-aware representations. In addition, we propose a low-rank estimation of the core tensor using CP decomposition to compress the model; this estimation is also applicable, as a regularization method, to other similar GNNs. We train our model with a contrastive loss, which relieves the training limitation of the 1-N scoring method on huge graphs. We achieve favorably competitive results on FB15k-237 and WN18RR with embeddings of comparably lower dimension; in particular, we improve R-GCN performance on FB15k-237 by 36% with the same decoder.
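A minimal sketch of the idea above: the relation-specific projection matrices used to transform neighbor entities are not stored separately per relation but are reconstructed from shared CP factors of a low-rank core tensor. All names, sizes, and the mean aggregation below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
num_relations, d_in, d_out, rank = 4, 8, 8, 3  # hypothetical sizes

# CP factors of the core projection tensor W (num_relations x d_in x d_out):
# W[p] ≈ sum_r rel_f[p, r] * outer(in_f[:, r], out_f[:, r]).
# The factors in_f and out_f are shared across all relations.
rel_f = rng.normal(size=(num_relations, rank))
in_f = rng.normal(size=(d_in, rank))
out_f = rng.normal(size=(d_out, rank))

def relation_projection(p):
    """Reconstruct the relation-specific projection matrix from shared CP factors."""
    return (in_f * rel_f[p]) @ out_f.T  # (d_in, d_out)

def aggregate(neighbors):
    """Mean-aggregate neighbor embeddings, each projected by its relation's matrix."""
    msgs = [h @ relation_projection(p) for h, p in neighbors]
    return np.mean(msgs, axis=0)

# Toy neighborhood: (neighbor embedding, relation id) pairs.
nbrs = [(rng.normal(size=d_in), 0), (rng.normal(size=d_in), 2)]
h_new = aggregate(nbrs)
```

Because only the factor matrices are learned, the parameter count scales with `rank * (num_relations + d_in + d_out)` instead of `num_relations * d_in * d_out`, which is the compression and regularization effect the abstract refers to.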