Invariant and equivariant networks are useful for learning from data with symmetries, such as images, sets, point clouds, and graphs. In this paper, we consider invariant and equivariant networks for symmetries of finite groups. Invariant and equivariant networks have been constructed by various researchers using Reynolds operators. However, Reynolds operators are computationally expensive when the order of the group is large because they sum over the whole group, which makes implementation difficult. To overcome this difficulty, we represent the Reynolds operator as a sum over a subset instead of a sum over the whole group. We call such a subset a Reynolds design, and an operator defined by a sum over a Reynolds design a reductive Reynolds operator. For example, for a graph with $n$ nodes, the computational complexity of the reductive Reynolds operator is reduced to $O(n^2)$, whereas that of the full Reynolds operator is $O(n!)$. We construct learning models based on the reductive Reynolds operator, called equivariant and invariant Reynolds networks (ReyNets), and prove that they have the universal approximation property. Reynolds designs for equivariant ReyNets are derived from combinatorial observations with Young diagrams, while Reynolds designs for invariant ReyNets are derived from invariants, called Reynolds dimensions, defined on the set of invariant polynomials. Numerical experiments show that the performance of our models is comparable to that of state-of-the-art methods.
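As a minimal sketch of the idea (not the paper's construction, whose Reynolds designs come from Young diagrams and Reynolds dimensions), the snippet below contrasts the full Reynolds operator for the symmetric group $S_n$, which averages a base function over all $n!$ permutations, with a sum over a small subset. The function names and the cyclic-shift "design" are illustrative assumptions; the cyclic shifts happen to form a valid design only for the toy linear base function used here.

```python
import itertools
import numpy as np

def reynolds_operator(f, x):
    # Full Reynolds operator for S_n: average f over all n! permutations
    # of the entries of x. Cost: O(n!) evaluations of f.
    perms = list(itertools.permutations(range(len(x))))
    return sum(f(x[list(p)]) for p in perms) / len(perms)

def reductive_reynolds_operator(f, x, design):
    # Sum over a subset of the group (a "Reynolds design") instead of the
    # whole group. When the subset is a Reynolds design for f's function
    # class, this agrees with the full operator at far lower cost.
    return sum(f(x[list(p)]) for p in design) / len(design)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(6)
    f = lambda v: v[0]  # toy linear base function

    # Hypothetical design for this f: the n cyclic shifts of (0, ..., n-1).
    # Each shift places a different entry of x in position 0, so the sum
    # over these n permutations reproduces the full average over all n!.
    n = len(x)
    cyclic = [tuple((i + j) % n for j in range(n)) for i in range(n)]

    full = reynolds_operator(f, x)                        # 720 evaluations
    reduced = reductive_reynolds_operator(f, x, cyclic)   # 6 evaluations
    print(np.isclose(full, reduced))  # True: both equal mean(x)
```

Both operators return the symmetrized value (here, the mean of $x$), but the reductive version evaluates $f$ only $n$ times rather than $n!$ times, which is the source of the complexity reduction claimed above.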