This paper shows that a simple baseline based on a Bag-of-Words (BoW) representation learns surprisingly good knowledge graph embeddings. By casting knowledge base completion and question answering as supervised classification problems, we observe that modeling co-occurrences of entities and relations leads to state-of-the-art performance with a training time of a few minutes using the open-source library fastText.
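As a minimal sketch of the framing described above (not the authors' released code), knowledge base completion can be cast as fastText supervised classification by treating the bag {head, relation} as the input text and the tail entity as the label. The file name, toy triples, and hyperparameters below are illustrative assumptions, not the paper's settings.

```python
import fasttext

def write_training_file(triples, path):
    """Write (head, relation, tail) triples in fastText's supervised format:
    the tail entity is the label; head and relation form the input 'text'."""
    with open(path, "w") as f:
        for head, rel, tail in triples:
            f.write(f"__label__{tail} {head} {rel}\n")

# Illustrative toy triples; a real run would use a benchmark such as FB15k.
triples = [
    ("paris", "located_in", "france"),
    ("berlin", "located_in", "germany"),
    ("france", "has_capital", "paris"),
]
write_training_file(triples, "kbc_train.txt")

# Train a linear bag-of-words classifier over entity/relation co-occurrences.
# Hyperparameters are placeholders, not those reported in the paper.
model = fasttext.train_supervised(input="kbc_train.txt", epoch=25, lr=0.5, dim=100)

# Link prediction: rank candidate tail entities for a (head, relation) query.
labels, probs = model.predict("paris located_in", k=3)
print(labels, probs)
```

Because fastText trains a linear model over averaged input embeddings, training completes in minutes even on large triple sets, which is the efficiency claim the abstract makes.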