Modern neural networks have been successful in many regression-based tasks such as face recognition, facial landmark detection, and image generation. In this work, we investigate an intuitive but understudied characteristic of modern neural networks: nonsmoothness. Experiments on synthetic data confirm that operations such as ReLU and max pooling in modern neural networks lead to nonsmoothness. We quantify the nonsmoothness using a feature named the sum of the magnitude of peaks (SMP) and model the input-output relationships for the building blocks of modern neural networks. Experimental results confirm that our model accurately predicts the statistical behavior of the nonsmoothness as it propagates through building blocks such as the convolutional layer, the ReLU activation, and the max pooling layer. We envision that the SMP feature can potentially serve as a forensic tool for regression-based applications of neural networks.
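The paper's exact SMP definition is not reproduced in this abstract; the following is a minimal illustrative sketch, assuming SMP is computed by summing the magnitudes of local peaks in the discrete second difference of a network's output along a 1-D input path (the tiny networks, random seed, and sampling grid below are assumptions for illustration, not the paper's setup). A piecewise-linear ReLU network produces pronounced kinks, while a smooth tanh network of the same shape does not:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 1-hidden-layer nets: scalar input -> 32 hidden units -> scalar output.
W1 = rng.normal(size=32)
b1 = rng.normal(size=32)
W2 = rng.normal(size=32)

def relu_net(x):
    # Piecewise linear in x: nonsmooth at each unit's activation boundary.
    return W2 @ np.maximum(W1 * x + b1, 0.0)

def tanh_net(x):
    # Smooth everywhere; serves as a reference with no kinks.
    return W2 @ np.tanh(W1 * x + b1)

def smp(f, xs):
    """Sum of the magnitudes of peaks in |second difference| of f along xs.

    Kinks in a piecewise-linear response show up as isolated spikes in the
    discrete second difference; summing the spike magnitudes quantifies
    nonsmoothness.
    """
    y = np.array([f(x) for x in xs])
    d2 = np.abs(np.diff(y, 2))
    # Local maxima of |second difference| mark candidate nonsmooth points.
    peaks = (d2[1:-1] > d2[:-2]) & (d2[1:-1] > d2[2:])
    return d2[1:-1][peaks].sum()

xs = np.linspace(-3.0, 3.0, 2001)
print("SMP (ReLU net):", smp(relu_net, xs))
print("SMP (tanh net):", smp(tanh_net, xs))
```

Under this assumed definition, the ReLU network's SMP is substantially larger than the tanh network's, since its second difference is near zero between kinks and spikes at each activation boundary.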