Feed-forward networks can be interpreted as mappings with linear decision surfaces at the level of the last layer. We investigate how the tangent space of the network can be exploited to refine the decision in the case of ReLU (Rectified Linear Unit) activations. We show that a simple Riemannian metric parametrized by the parameters of the network forms a similarity function that is at least as good as the original network, and we suggest a sparse metric to increase the similarity gap.
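To make the tangent-space construction concrete, the following is a minimal NumPy sketch, not the paper's exact method: a ReLU network is piecewise linear, so its input Jacobian J(x) is a product of weight matrices masked by the active units, and the pullback metric G(x) = J(x)^T J(x) induces a local quadratic-form similarity between inputs. All function names, the two-layer architecture, and the quadratic-form similarity are illustrative assumptions; the paper's specific parametrization and the proposed sparse variant of the metric are not reproduced here.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def forward(x, weights, biases):
    """Forward pass of a small feed-forward ReLU network with a linear last layer."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = relu(W @ a + b)
    return weights[-1] @ a + biases[-1]

def jacobian(x, weights, biases):
    """Jacobian of the network output w.r.t. the input x.

    With ReLU activations the network is piecewise linear, so the Jacobian
    is the product of the weight matrices masked by the units active at x
    (using the standard subgradient choice 0 at z == 0).
    """
    a = x
    J = np.eye(x.shape[0])
    for W, b in zip(weights[:-1], biases[:-1]):
        z = W @ a + b
        mask = (z > 0).astype(z.dtype)   # indicator of active ReLU units
        J = (mask[:, None] * W) @ J      # mask rows, then chain the linear maps
        a = relu(z)
    return weights[-1] @ J

def pullback_metric(x, weights, biases):
    """Pullback (Riemannian) metric G(x) = J(x)^T J(x) at input x."""
    J = jacobian(x, weights, biases)
    return J.T @ J

def metric_similarity(x, y, weights, biases):
    """Quadratic form of the displacement under the metric at x:
    a local, first-order notion of distance between inputs."""
    G = pullback_metric(x, weights, biases)
    d = y - x
    return float(d @ G @ d)

# Illustrative usage with random parameters (4 -> 8 -> 3 network).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((8, 4)), rng.standard_normal((3, 8))]
biases = [rng.standard_normal(8), rng.standard_normal(3)]
x, y = rng.standard_normal(4), rng.standard_normal(4)
print(metric_similarity(x, y, weights, biases))
```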