We prove an impossibility result, which in the context of function learning says the following: under certain conditions, it is impossible to simultaneously learn symmetries and functions equivariant under them using an ansatz consisting of equivariant functions. To formalize this statement, we carefully study notions of approximation for groups and semigroups. We analyze certain families of neural networks for whether they satisfy the conditions of the impossibility result: what we call ``linearly equivariant'' networks, and group-convolutional networks. Much can be said precisely about linearly equivariant networks, making them theoretically useful. On the practical side, our analysis of group-convolutional neural networks allows us to generalize the well-known ``convolution is all you need'' theorem to non-homogeneous spaces. We additionally find an important difference between group convolution and semigroup convolution.