Hyperbolic neural networks have gained popularity in recent years due to their ability to represent hierarchical data sets effectively and efficiently. The challenge in developing these networks lies in the nonlinearity of the embedding space, namely hyperbolic space. Hyperbolic space is a homogeneous Riemannian manifold of the Lorentz group. Most existing methods (with some exceptions) use local linearization to define a variety of operations paralleling those used in traditional deep neural networks in Euclidean spaces. In this paper, we present a novel fully hyperbolic neural network which uses projections (embeddings) followed by intrinsic aggregation and a nonlinearity, all within hyperbolic space. The novelty lies in the projection, which is designed to map data onto a lower-dimensional embedded hyperbolic space and hence yields a nested hyperbolic space representation that is independently useful for dimensionality reduction. Our main theoretical contribution is a proof that the proposed embedding is isometric and equivariant under Lorentz transformations. This projection is computationally efficient, since it can be expressed by simple linear operations, and, due to the aforementioned equivariance property, it allows for weight sharing. The nested hyperbolic space representation is the core component of our network; we therefore first compare this representation with other dimensionality reduction methods, namely tangent PCA, principal geodesic analysis (PGA), and HoroPCA. Based on this equivariant embedding, we develop a novel fully hyperbolic graph convolutional neural network architecture to learn the parameters of the projection. Finally, we present experiments demonstrating the comparative performance of our network on several publicly available data sets.
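To make the nested-hyperbolic-space idea concrete, the sketch below works in the Lorentz (hyperboloid) model referenced above. It is a minimal illustration, not the paper's method: it uses the simplest possible isometric, totally geodesic embedding of H^n into H^{n+1} (appending a zero spatial coordinate), whereas the paper learns a general Lorentz-equivariant projection. All function names here are hypothetical helpers.

```python
import numpy as np

def minkowski_inner(x, y):
    # Lorentzian (Minkowski) inner product: <x, y>_L = -x0*y0 + sum_i xi*yi
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lift_to_hyperboloid(v):
    # Map spatial coordinates v in R^n onto the hyperboloid H^n in R^{n+1}:
    # x0 = sqrt(1 + ||v||^2) guarantees <x, x>_L = -1 with x0 > 0.
    return np.concatenate(([np.sqrt(1.0 + v @ v)], v))

def lorentz_distance(x, y):
    # Geodesic distance in the Lorentz model: d(x, y) = arccosh(-<x, y>_L).
    return np.arccosh(np.clip(-minkowski_inner(x, y), 1.0, None))

def embed_nested(x):
    # Simplest isometric, totally geodesic embedding H^n -> H^{n+1}:
    # append a zero spatial coordinate (an assumed toy construction;
    # a learned projection would replace this in the paper's setting).
    return np.concatenate((x, [0.0]))

x = lift_to_hyperboloid(np.array([0.3, -0.5]))
y = lift_to_hyperboloid(np.array([1.0, 0.2]))
d_low = lorentz_distance(x, y)
d_high = lorentz_distance(embed_nested(x), embed_nested(y))
assert np.isclose(d_low, d_high)  # pairwise distances preserved: isometric
```

Because the Minkowski inner product is unchanged by padding with a zero coordinate, distances are preserved exactly, which is the isometry property the abstract claims for the (more general, learned) projection.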