Is a sample rich enough to determine, at least locally, the parameters of a neural network? To answer this question, we introduce a new local parameterization of a given deep ReLU neural network, obtained by fixing the values of some of its weights. This parameterization defines local lifting operators whose inverses are charts of a smooth manifold embedded in a high-dimensional space. The function implemented by the deep ReLU network is the composition of a local lifting with a linear operator that depends on the sample. From this convenient representation we derive a geometrical necessary and sufficient condition for local identifiability. Examining tangent spaces, the geometrical condition provides: 1/ a sharp and testable necessary condition of identifiability and 2/ a sharp and testable sufficient condition of local identifiability. The validity of the conditions can be tested numerically using backpropagation and matrix rank computations.
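The final claim, that the conditions can be checked via backpropagation and matrix rank computations, can be illustrated with a minimal numerical sketch. The idea is to stack the derivatives of the network outputs at the sample points with respect to the parameters into a Jacobian matrix and examine its rank: rank deficiency beyond the known ReLU rescaling symmetries signals a failure of local identifiability. The code below is a hypothetical illustration, not the paper's algorithm: it uses a tiny one-hidden-layer ReLU network, finite differences in place of backpropagation, and an assumed tolerance for the rank computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny network: input width d, hidden width h, output width o.
d, h, o = 2, 3, 1
n_params = h * d + h + o * h          # entries of W1, b1, W2

def unpack(theta):
    """Split a flat parameter vector into weight matrices and bias."""
    W1 = theta[:h * d].reshape(h, d)
    b1 = theta[h * d:h * d + h]
    W2 = theta[h * d + h:].reshape(o, h)
    return W1, b1, W2

def net(theta, x):
    """One-hidden-layer ReLU network evaluated at input x."""
    W1, b1, W2 = unpack(theta)
    return W2 @ np.maximum(W1 @ x + b1, 0.0)

def jacobian(theta, X, eps=1e-6):
    """Finite-difference Jacobian of the stacked outputs over the
    sample X with respect to the parameters (a stand-in for the
    backpropagation-based computation described in the abstract)."""
    base = np.concatenate([net(theta, x) for x in X])
    J = np.empty((base.size, theta.size))
    for j in range(theta.size):
        t = theta.copy()
        t[j] += eps
        J[:, j] = (np.concatenate([net(t, x) for x in X]) - base) / eps
    return J

theta = rng.standard_normal(n_params)
X = rng.standard_normal((20, d))       # a sample of 20 inputs

# The tolerance separates genuine directions from finite-difference noise.
J = jacobian(theta, X)
r = np.linalg.matrix_rank(J, tol=1e-4)
print(f"Jacobian rank: {r} / {n_params} parameters")
```

For a generic ReLU network the rank stays below the parameter count, because each hidden neuron carries a one-parameter positive-rescaling symmetry that the Jacobian cannot see; the testable conditions in the abstract are phrased modulo such symmetries.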