The application of deep learning to symbolic domains remains an active research endeavour. Graph neural networks (GNNs), consisting of trained neural modules which can be arranged in different topologies at run time, are a sound alternative for tackling relational problems which lend themselves to graph representations. In this paper, we show that GNNs are capable of multitask learning, which can be naturally enforced by training the model to refine a single set of multidimensional embeddings $\in \mathbb{R}^d$ and decode them into multiple outputs by connecting MLPs at the end of the pipeline. We demonstrate the multitask learning capability of the model on the relevant relational problem of estimating network centrality measures, i.e. ``is vertex $v_1$ more central than vertex $v_2$ given centrality $c$?'' We then show that a GNN can be trained to develop a \textit{lingua franca} of vertex embeddings from which all relevant information about any of the trained centrality measures can be decoded. The proposed model achieves $89\%$ accuracy on a test dataset of random instances with up to 128 vertices and is shown to generalise to larger problem sizes. The model is also shown to obtain reasonable accuracy on a dataset of real-world instances with up to 4,000 vertices, vastly surpassing the size of the largest instances on which the model was trained ($n=128$). Finally, we believe that our contributions attest to the potential of GNNs in symbolic domains in general and in relational learning in particular.
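The architecture described above, a shared message-passing core refining one set of vertex embeddings, with separate MLP decoders reading pairwise centrality comparisons off those embeddings, can be sketched as follows. This is an illustrative toy with untrained random weights, not the paper's implementation; all function names and dimensions here are assumptions for the sake of the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    # Two-layer perceptron with a ReLU hidden layer.
    return np.maximum(x @ W1 + b1, 0.0) @ W2 + b2

# Toy graph: adjacency matrix of a 4-vertex undirected graph.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

d = 8                                # embedding dimension (R^d)
X = rng.normal(size=(4, d))          # one shared set of vertex embeddings

# Shared message-passing refinement: aggregate neighbour embeddings,
# transform, and apply a residual update for a few iterations.
W_msg = 0.1 * rng.normal(size=(d, d))
for _ in range(3):
    X = np.tanh(A @ X @ W_msg + X)

# One decoder MLP per centrality measure, reading pairs of embeddings.
def compare(v1, v2, params):
    """Logit for 'v1 is more central than v2' under one centrality."""
    z = np.concatenate([X[v1], X[v2]])
    return mlp(z, *params)[0]

# Hypothetical decoder parameters for a single centrality (random here;
# in the model each centrality gets its own trained MLP).
params_c = (0.1 * rng.normal(size=(2 * d, d)), np.zeros(d),
            0.1 * rng.normal(size=(d, 1)), np.zeros(1))

logit = compare(0, 3, params_c)
```

Training would fit the shared `W_msg` (in practice, a richer recurrent update) jointly with all decoder MLPs, so every centrality is decoded from the same embeddings.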