Knowledge graph embedding, which projects symbolic entities and relations into continuous vector spaces, is gaining increasing attention. Previous methods learn a single static embedding for each entity or relation, ignoring their intrinsic contextual nature, i.e., entities and relations may appear in different graph contexts and accordingly exhibit different properties. This work presents Contextualized Knowledge Graph Embedding (CoKE), a novel paradigm that takes this contextual nature into account and learns dynamic, flexible, and fully contextualized entity and relation embeddings. Two types of graph contexts are studied: edges and paths, both formulated as sequences of entities and relations. CoKE takes such a sequence as input and uses a Transformer encoder to obtain contextualized representations. These representations are hence naturally adaptive to the input, capturing the contextual meanings of the entities and relations therein. Evaluation on a wide variety of public benchmarks verifies the superiority of CoKE in link prediction and path query answering. It performs consistently better than, or at least as well as, the current state of the art in almost every case, in particular offering an absolute improvement of 19.7% in H@10 on path query answering. Our code is available at \url{https://github.com/paddlepaddle/models/tree/develop/PaddleKG/CoKE}.