Graph neural networks (GNNs) have been successful in many fields, spawning a variety of research efforts and real-world industrial applications. However, in privacy-sensitive scenarios such as finance and healthcare, training a GNN model centrally is challenging because the data are distributed across isolated silos. Federated learning (FL) is an emerging technique that collaboratively trains a shared model while keeping the data decentralized, making it a rational solution for distributed GNN training. We term this federated graph learning (FGL). Although FGL has received increasing attention recently, its definition and challenges are still up in the air. In this position paper, we present a categorization to clarify them. Based on how graph data are distributed among clients, we propose four types of FGL: inter-graph FL, intra-graph FL, and graph-structured FL, where intra-graph FL is further divided into horizontal and vertical FGL. For each type of FGL, we discuss the formulation and applications in detail and identify potential challenges.
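The core FL idea referenced above (collaborative training without centralizing data) can be sketched as a minimal FedAvg-style loop. This is an illustrative toy, not the paper's method: `local_update` stands in for real local gradient descent, and all names and datasets are hypothetical.

```python
# Minimal sketch of federated averaging (FedAvg-style aggregation):
# each client trains on its private data silo and only model parameters
# are exchanged with the server, never raw data. Purely illustrative.

def local_update(weights, data, lr=0.1):
    """One local training step: nudge weights toward the client's data mean,
    a stand-in for gradient descent on a real local loss."""
    target = sum(data) / len(data)
    return [w - lr * (w - target) for w in weights]

def fed_avg(client_datasets, rounds=5, dim=3):
    """Run `rounds` of local training + server-side parameter averaging."""
    global_weights = [0.0] * dim
    for _ in range(rounds):
        # Each client updates a copy of the global model on its own silo.
        client_weights = [local_update(list(global_weights), data)
                          for data in client_datasets]
        # The server aggregates by simple (unweighted) averaging.
        global_weights = [sum(ws) / len(ws) for ws in zip(*client_weights)]
    return global_weights

# Three hypothetical data silos (e.g. three banks or hospitals).
silos = [[1.0, 2.0, 3.0], [4.0, 5.0], [0.5, 1.5]]
print(fed_avg(silos))
```

In real FGL settings the averaged parameters would belong to a GNN, and the aggregation is typically weighted by each client's sample count, but the data-stays-local structure is the same.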