It is well known that the parameterized family of functions representable by fully-connected feedforward neural networks with the ReLU activation function is precisely the class of continuous piecewise linear functions with finitely many pieces. It is less well known that for every fixed ReLU neural network architecture, the parameter space admits positive-dimensional spaces of symmetries, and hence the local functional dimension near any given parameter is strictly lower than the parametric dimension. In this work we carefully define the notion of functional dimension, show that it is inhomogeneous across the parameter space of ReLU neural network functions, and continue an investigation (initiated in [14] and [5]) into when the functional dimension achieves its theoretical maximum. We also study the quotient space and fibers of the realization map from parameter space to function space, supplying examples of fibers that are disconnected, fibers on which the functional dimension is non-constant, and fibers on which the symmetry group acts non-transitively.
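To make the symmetry claim concrete, the following is a minimal illustration of the standard positive-scaling invariance of ReLU, one well-known source of the positive-dimensional symmetries referenced above (the notation $\sigma$, $w$, $b$, $v$ is ours, introduced for illustration, and is not taken from the works cited):
\[
\sigma(x) = \max(0, x), \qquad \sigma(c\,x) = c\,\sigma(x) \quad \text{for all } c > 0,
\]
so for any hidden neuron with incoming weights $w$, bias $b$, and outgoing weight $v$,
\[
v\,\sigma\bigl(w^{\top} x + b\bigr) \;=\; \frac{v}{c}\,\sigma\bigl(c\,w^{\top} x + c\,b\bigr) \qquad \text{for every } c > 0.
\]
Each hidden neuron thus contributes a one-parameter family of distinct parameter settings realizing the same function, so the realization map from parameter space to function space has positive-dimensional fibers, consistent with the dimension gap described above.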