Due to its geometric properties, hyperbolic space can support high-fidelity embeddings of tree- and graph-structured data, upon which various hyperbolic networks have been developed. Existing hyperbolic networks encode geometric priors not only for the input, but also at every layer of the network. This approach involves repeatedly mapping to and from hyperbolic space, which makes these networks complicated to implement, computationally expensive to scale, and numerically unstable to train. In this paper, we propose a simpler approach: learn a hyperbolic embedding of the input, then map once from it to Euclidean space using a mapping that encodes geometric priors by respecting the isometries of hyperbolic space, and finish with a standard Euclidean network. The key insight is to use a random feature mapping via the eigenfunctions of the Laplace operator, which we show can approximate any isometry-invariant kernel on hyperbolic space. Our method can be used together with any graph neural network: even a linear graph model yields significant improvements in efficiency and performance over other hyperbolic baselines on both transductive and inductive tasks.
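As a concrete illustration of the feature map described above, here is a minimal NumPy sketch of one natural instantiation on the Poincaré ball: eigenfunctions of the hyperbolic Laplacian take the form of horocycle waves e^{(iλ + (n-1)/2) B_ω(x)}, where B_ω is the Busemann function of a boundary point ω, so a real-valued random feature combines the exponential growth term with a random cosine, analogous to random Fourier features in the Euclidean case. The function name `hyla_features`, the Gaussian sampling of the frequencies λ, and the parameter choices are illustrative assumptions, not the paper's exact recipe; the spectral density for λ should match the target isometry-invariant kernel.

```python
import numpy as np

def hyla_features(x, num_features=100, scale=1.0, seed=0):
    """Map Poincare-ball embeddings x of shape (m, n) to Euclidean
    random features of shape (m, num_features).

    Each feature is derived from a Laplacian eigenfunction
    e^{(i*lam + (n-1)/2) * B_w(x)}, with w a random boundary point,
    lam a random frequency, and B_w the Busemann function.
    """
    m, n = x.shape
    rng = np.random.default_rng(seed)

    # Boundary points w: uniform on the unit sphere S^{n-1}.
    w = rng.standard_normal((num_features, n))
    w /= np.linalg.norm(w, axis=1, keepdims=True)

    # Random frequencies and phases. Gaussian lam is a placeholder;
    # the right density depends on the kernel being approximated.
    lam = scale * rng.standard_normal(num_features)
    b = rng.uniform(0.0, 2.0 * np.pi, num_features)

    # Busemann function on the Poincare ball:
    #   B_w(x) = log( (1 - |x|^2) / |x - w|^2 )
    sq_norm = np.sum(x * x, axis=1, keepdims=True)                 # (m, 1)
    sq_dist = np.sum((x[:, None, :] - w[None, :, :])**2, axis=2)   # (m, D)
    busemann = np.log((1.0 - sq_norm) / sq_dist)

    # Growth term times random cosine, with standard RFF normalization.
    return (np.sqrt(2.0 / num_features)
            * np.exp(0.5 * (n - 1) * busemann)
            * np.cos(lam * busemann + b))

# Example: featurize 5 points near the origin of the 2-D Poincare ball,
# then feed the resulting Euclidean features to any standard model.
x = 0.1 * np.random.default_rng(1).standard_normal((5, 2))
feats = hyla_features(x, num_features=64)   # shape (5, 64)
```

The design choice worth noting is that all hyperbolic geometry is confined to this single map: the embeddings live in hyperbolic space, but everything downstream of `feats` is an ordinary Euclidean network, which is what avoids the repeated to-and-from mappings of prior hyperbolic networks.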