Due to its geometric properties, hyperbolic space can support high-fidelity embeddings of tree- and graph-structured data. As a result, various hyperbolic networks have been developed that outperform Euclidean networks on many tasks: e.g., hyperbolic graph convolutional networks (GCNs) can outperform vanilla GCNs on some graph learning tasks. However, most existing hyperbolic networks are complicated, computationally expensive, and numerically unstable, and these shortcomings prevent them from scaling to large graphs. With more and more hyperbolic networks being proposed, it is becoming less and less clear which components are actually necessary for good performance. In this paper, we propose HyLa, a simple and minimal approach to using hyperbolic space in networks: HyLa maps once from a hyperbolic-space embedding to Euclidean space via the eigenfunctions of the Laplacian operator in the hyperbolic space. We evaluate HyLa on graph learning tasks including node classification and text classification, where HyLa can be used together with any graph neural network. When used with a linear model, HyLa shows significant improvements over hyperbolic networks and other baselines.
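The once-only map from a hyperbolic embedding to Euclidean features via Laplacian eigenfunctions can be sketched as below. This is a minimal illustration only, assuming the Poincaré ball model, eigenfunctions built from the Busemann function, and randomly sampled boundary directions, frequencies, and phases; the function names (`busemann`, `hyla_features`) and the exact featurization are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def busemann(x, omega):
    # Busemann function on the Poincaré ball, for points x inside the
    # ball and an ideal point omega on the boundary (unit sphere):
    #   B_omega(x) = log( (1 - ||x||^2) / ||x - omega||^2 )
    return np.log((1.0 - np.sum(x**2, axis=-1))
                  / np.sum((x - omega)**2, axis=-1))

def hyla_features(X, n_features=16, seed=0):
    # Map hyperbolic-space points X (n_points, d) to Euclidean features
    # using randomized Laplacian-eigenfunction-style features
    # (an assumed form, analogous to random Fourier features):
    #   phi_j(x) = exp(((d-1)/2) * B_j(x)) * cos(lambda_j * B_j(x) + theta_j)
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Random ideal points on the boundary (unit vectors).
    omegas = rng.normal(size=(n_features, d))
    omegas /= np.linalg.norm(omegas, axis=1, keepdims=True)
    lams = rng.normal(size=n_features)                       # frequencies
    thetas = rng.uniform(0.0, 2.0 * np.pi, size=n_features)  # phases
    feats = np.empty((X.shape[0], n_features))
    for j in range(n_features):
        b = busemann(X, omegas[j])
        feats[:, j] = np.exp(0.5 * (d - 1) * b) * np.cos(lams[j] * b + thetas[j])
    return feats

# Example: three points inside the Poincaré disk (norm < 1).
X = np.array([[0.0, 0.0], [0.3, 0.1], [-0.5, 0.2]])
feats = hyla_features(X)
```

The resulting Euclidean feature matrix `feats` could then be fed to any downstream model (a linear classifier or a GCN), which is what makes the approach minimal: all hyperbolic geometry is confined to this single featurization step.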