We propose layer saturation - a simple, online-computable method for analyzing the information processing in neural networks. First, we show that a layer's output can be restricted to the eigenspace of its variance matrix without loss of performance. We then propose a computationally lightweight method for approximating this variance matrix during training. From the dimension of its lossless eigenspace we derive layer saturation - the ratio between the eigenspace dimension and the layer width. We show that saturation indicates which layers contribute to network performance. We further demonstrate how to alter the saturation of a neural network's layers by changing network depth, filter sizes, and input resolution. Finally, we show that a well-chosen input resolution increases network performance by distributing the inference process more evenly across the network.
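The statistic described above can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: the running second-moment estimator, the 99% explained-variance threshold for the "lossless" eigenspace dimension, and all function names are assumptions made for the example.

```python
import numpy as np

def update_running_stats(stats, batch):
    """Accumulate sufficient statistics for the layer-output variance
    matrix online, one mini-batch of layer outputs at a time."""
    n, sum_x, sum_outer = stats
    n += batch.shape[0]
    sum_x = sum_x + batch.sum(axis=0)
    sum_outer = sum_outer + batch.T @ batch
    return n, sum_x, sum_outer

def saturation(stats, explained=0.99):
    """Saturation = (dimension of the eigenspace needed to explain
    `explained` of the variance) / layer width.  The 0.99 threshold
    is an illustrative choice."""
    n, sum_x, sum_outer = stats
    mean = sum_x / n
    cov = sum_outer / n - np.outer(mean, mean)
    eigvals = np.linalg.eigvalsh(cov)[::-1]           # descending
    eigvals = np.clip(eigvals, 0.0, None)             # guard numerical noise
    ratio = np.cumsum(eigvals) / eigvals.sum()
    k = int(np.searchsorted(ratio, explained) + 1)    # smallest k reaching threshold
    return k / cov.shape[0]

# Usage: stream batches of layer outputs (width 64) through the estimator.
# Here the outputs effectively live in an 8-dimensional subspace, so the
# layer should come out far from fully saturated.
width = 64
stats = (0, np.zeros(width), np.zeros((width, width)))
rng = np.random.default_rng(0)
basis = rng.normal(size=(8, width))
for _ in range(50):
    z = rng.normal(size=(32, 8)) @ basis
    stats = update_running_stats(stats, z)
print(saturation(stats))
```

Because only the running sums (a vector and a width-by-width matrix) are kept, the cost per batch is small and independent of the number of batches, which is what makes the measure online-computable.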