We develop a theory of category-equivariant neural networks (CENNs) that unifies group/groupoid-equivariant networks, poset/lattice-equivariant networks, and graph and sheaf neural networks. Equivariance is formulated as naturality in a topological category equipped with Radon measures, within which both linear and nonlinear layers are defined. We prove an equivariant universal approximation theorem in this general setting: the class of finite-depth CENNs is dense in the space of continuous equivariant transformations. We instantiate the framework for groups/groupoids, posets/lattices, graphs, and cellular sheaves, deriving universal approximation theorems for each in a systematic manner. Categorical equivariant deep learning thus expands the horizons of equivariant deep learning beyond group actions, encompassing not only geometric symmetries but also contextual and compositional symmetries.
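As a minimal illustration of equivariance as naturality (not code from the paper), the sketch below checks the commuting square L(g · x) = g · L(x) for the simplest classical instance: the cyclic group C_n acting on signals in R^n by shifts, with a circular convolution as the layer L. The function names `shift` and `circ_conv` are hypothetical helpers introduced here for the example.

```python
def shift(x, g):
    """Action of the cyclic group C_n on a signal: rotate by g positions."""
    n = len(x)
    return [x[(i - g) % n] for i in range(n)]

def circ_conv(x, w):
    """Circular convolution layer: weights shared across all positions,
    which is exactly what makes the layer a natural transformation."""
    n = len(x)
    return [sum(w[k] * x[(i + k) % n] for k in range(len(w))) for i in range(n)]

x = [1.0, 2.0, 3.0, 4.0, 5.0]
w = [0.5, -1.0, 0.25]

# Naturality / equivariance: acting first and then applying the layer
# agrees with applying the layer and then acting, for every group element.
equivariant = all(
    circ_conv(shift(x, g), w) == shift(circ_conv(x, w), g)
    for g in range(len(x))
)
```

In categorical terms, each shift g is a morphism, `circ_conv` is the component of a natural transformation between the input and output signal functors, and the check above verifies that every naturality square commutes.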