We develop a theory of category-equivariant neural networks (CENNs) that unifies group/groupoid-equivariant networks, poset/lattice-equivariant networks, and graph and sheaf neural networks. Equivariance is formulated as naturality in a topological category equipped with Radon measures. Formulating linear and nonlinear layers in this categorical setup, we prove an equivariant universal approximation theorem in full generality: the class of finite-depth CENNs is dense in the space of continuous equivariant transformations. We instantiate the framework for groups/groupoids, posets/lattices, graphs, and cellular sheaves, deriving universal approximation theorems for each in a systematic manner. Categorical equivariant deep learning thus expands the horizons of equivariant deep learning beyond group actions, encompassing not only geometric symmetries but also contextual and compositional symmetries.