Deep Neural Networks (DNNs) are widely used for their ability to effectively approximate large classes of functions. This flexibility, however, makes the strict enforcement of constraints on DNNs an open problem. Here we present a framework that, under mild assumptions, allows the exact enforcement of constraints on parameterized sets of functions such as DNNs. Instead of imposing "soft" constraints via additional terms in the loss, we restrict (a subset of) the DNN parameters to a submanifold on which the constraints are satisfied exactly throughout the entire training procedure. We focus on constraints that are outside the scope of equivariant networks used in Geometric Deep Learning. As a major example of the framework, we restrict filters of a Convolutional Neural Network (CNN) to be wavelets, and apply these wavelet networks to the task of contour prediction in the medical domain.
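To make the parameterization idea concrete, the sketch below constrains the filters of a convolutional layer to a linear submanifold on which a simple wavelet-style admissibility condition (zero mean per filter) holds exactly at every training step, rather than penalizing violations in the loss. This is a minimal PyTorch sketch under assumed names (`ZeroMeanConv2d`, `raw_weight`, `constrained_weight`); the actual wavelet constraints used in the paper are richer than this illustrative stand-in condition.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ZeroMeanConv2d(nn.Module):
    """Convolution whose filters lie, by construction, on the linear submanifold
    of zero-mean kernels (a simple wavelet-style admissibility condition), so the
    constraint holds exactly at every step instead of being a soft penalty."""

    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        self.padding = kernel_size // 2
        # Unconstrained parameters; the constraint is enforced in forward().
        self.raw_weight = nn.Parameter(
            0.1 * torch.randn(out_channels, in_channels, kernel_size, kernel_size)
        )
        self.bias = nn.Parameter(torch.zeros(out_channels))

    def constrained_weight(self):
        # Map the free parameters onto the constraint manifold: subtract each
        # filter's mean so that every filter sums to zero exactly.
        return self.raw_weight - self.raw_weight.mean(dim=(2, 3), keepdim=True)

    def forward(self, x):
        return F.conv2d(x, self.constrained_weight(), self.bias, padding=self.padding)

if __name__ == "__main__":
    layer = ZeroMeanConv2d(in_channels=3, out_channels=8, kernel_size=5)
    y = layer(torch.randn(2, 3, 32, 32))
    # The constraint is satisfied (up to floating-point error) before any training,
    # and gradient updates to raw_weight can never violate it.
    print(y.shape, layer.constrained_weight().sum(dim=(2, 3)).abs().max().item())
```

Because the constrained weights are a differentiable function of the free parameters, standard optimizers can be used unchanged while the constraint is preserved throughout training.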