Equivariance of linear neural network layers is well studied. In this work, we relax the equivariance condition to hold only in a projective sense. In particular, we study the relation between projective and ordinary equivariance and show that for important examples, the problems are in fact equivalent. The 3D rotation group, for instance, acts projectively on the projective plane. We experimentally study the practical importance of rotation equivariance when designing networks for filtering 2D-2D correspondences. Fully equivariant models perform poorly, and while a simple addition of invariant features to a strong baseline yields improvements, this gain does not appear to stem from improved equivariance.
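The relaxation can be made precise as follows (a plausible formalization in standard notation; the symbols $f$, $\rho$, $\rho'$, and $\alpha$ are assumed here, not taken verbatim from the abstract). Ordinary equivariance of a map $f$ with respect to group representations $\rho$ and $\rho'$ requires

$$
f(\rho(g)\,x) = \rho'(g)\,f(x) \quad \text{for all } g \in G,
$$

whereas projective equivariance only requires equality up to a nonzero scalar that may depend on the group element and the input:

$$
f(\rho(g)\,x) = \alpha(g, x)\,\rho'(g)\,f(x), \qquad \alpha(g, x) \neq 0.
$$

This weaker condition is natural whenever the data itself is only defined up to scale, as for homogeneous coordinates on the projective plane, where 3D rotations act projectively.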