We present the group equivariant conditional neural process (EquivCNP), a meta-learning method that is permutation invariant over the data set, as in conventional conditional neural processes (CNPs), and is additionally equivariant to transformations of the data space. Incorporating group equivariance, such as rotation and scaling equivariance, provides a way to exploit the symmetry of real-world data. We give a decomposition theorem for permutation-invariant and group-equivariant maps, which leads us to construct EquivCNPs with an infinite-dimensional latent space that handles group symmetries. For practical implementation, we build the architecture using Lie group convolutional layers. We show that EquivCNP with translation equivariance achieves performance comparable to conventional CNPs on a 1D regression task. Moreover, we demonstrate that, by selecting an appropriate Lie group equivariance, EquivCNP is capable of zero-shot generalization on an image-completion task.
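To make the equivariance notion concrete, the following is a minimal numpy sketch (not the paper's architecture) of the translation-equivariance property that convolutional layers provide: shifting the input and then convolving gives the same result as convolving and then shifting. A circular 1D convolution is assumed for simplicity; the function name `conv1d` and the kernel values are illustrative.

```python
import numpy as np

def conv1d(signal, kernel):
    """Circular 1D convolution: a basic translation-equivariant map."""
    n, k = len(signal), len(kernel)
    out = np.zeros(n)
    for i in range(n):
        for j in range(k):
            out[i] += kernel[j] * signal[(i - j) % n]
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=16)          # toy input signal
w = np.array([0.25, 0.5, 0.25])  # illustrative smoothing kernel

# Translation equivariance: shifting the input shifts the output identically.
shifted_then_conv = conv1d(np.roll(x, 3), w)
conv_then_shifted = np.roll(conv1d(x, w), 3)
print(np.allclose(shifted_then_conv, conv_then_shifted))  # True
```

EquivCNP generalizes this property from translations to other Lie groups (e.g. rotations and scalings) via Lie group convolutional layers.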