Knowledge graph embedding plays an important role in knowledge representation, reasoning, and data mining applications. However, for multiple cross-domain knowledge graphs, state-of-the-art embedding models cannot make full use of the data from different knowledge domains while preserving the privacy of exchanged data. In addition, a centralized embedding model may not scale to extensive real-world knowledge graphs. Therefore, we propose a novel decentralized, scalable learning framework, \emph{Federated Knowledge Graphs Embedding} (FKGE), in which embeddings for different knowledge graphs are learned in an asynchronous, peer-to-peer, and privacy-preserving manner. FKGE exploits adversarial generation between pairs of knowledge graphs to translate identical entities and relations from different domains into nearby embedding spaces. To protect the privacy of the training data, FKGE further implements a privacy-preserving neural network structure that guarantees no raw data is leaked. We conduct extensive experiments evaluating FKGE on 11 knowledge graphs, demonstrating significant and consistent improvements in model quality, with performance gains of up to 17.85\% on triple classification and 7.90\% on link prediction.
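To make the adversarial-alignment idea above concrete, the following is a minimal sketch (in PyTorch) of translating one knowledge graph's embeddings of shared entities into another graph's embedding space with a generator/discriminator pair. It is not the authors' implementation: the dimensions, network sizes, optimizers, and toy random embeddings below are illustrative assumptions only, and the privacy-preserving components of FKGE are omitted.

\begin{verbatim}
# Illustrative sketch only: adversarially align KG-A's embeddings of shared
# entities with KG-B's embedding space. All sizes and data are toy assumptions.
import torch
import torch.nn as nn

DIM = 64        # embedding dimension (assumed)
N_SHARED = 128  # number of entities/relations the two graphs share (toy)

# Stand-ins for locally trained embeddings of the shared entities in KG A and KG B.
emb_a = torch.randn(N_SHARED, DIM)
emb_b = torch.randn(N_SHARED, DIM)

# Generator: translates KG-A embeddings into KG-B's embedding space.
generator = nn.Sequential(nn.Linear(DIM, DIM), nn.ReLU(), nn.Linear(DIM, DIM))
# Discriminator: tries to tell translated embeddings apart from native KG-B ones.
discriminator = nn.Sequential(nn.Linear(DIM, DIM), nn.ReLU(), nn.Linear(DIM, 1))

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(200):
    # Discriminator update: real = native KG-B embeddings, fake = translated KG-A.
    fake = generator(emb_a).detach()
    d_loss = (bce(discriminator(emb_b), torch.ones(N_SHARED, 1))
              + bce(discriminator(fake), torch.zeros(N_SHARED, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator update: fool the discriminator so the two spaces move closer.
    g_loss = bce(discriminator(generator(emb_a)), torch.ones(N_SHARED, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# After training, generator(emb_a) lies near emb_b, so shared entities of the
# two graphs occupy comparable embedding spaces without exchanging raw triples.
\end{verbatim}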