Neural networks are an indispensable model class for many complex learning tasks. Despite the popularity and importance of neural networks, and despite the many established techniques from the literature for stabilizing and robustifying their training, classical concepts from robust statistics have so far rarely been considered in the context of neural networks. We therefore adapt the notion of the regression breakdown point to regression neural networks and compute the breakdown point for different feed-forward network configurations and contamination settings. In an extensive simulation study, we compare the performance of non-robust and robust regression feed-forward neural networks across a plethora of configurations, measured by the out-of-sample loss, by a proxy of the breakdown rate, and by the number of training steps. The results indeed motivate the use of robust loss functions for neural network training.
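To illustrate the kind of robustness the abstract refers to, the following minimal NumPy sketch contrasts the classical squared-error loss with the Huber loss, a standard robust loss function. The example data, the choice of the Huber loss, and the threshold `delta=1.0` are illustrative assumptions, not taken from the paper; the paper's own loss functions and contamination settings may differ. The point is only that a single gross outlier inflates the squared loss far more than a loss with linear tails.

```python
import numpy as np

def squared_loss(residuals):
    # classical least-squares loss: grows quadratically in the residual,
    # so one gross outlier can dominate the objective
    return np.mean(residuals ** 2)

def huber_loss(residuals, delta=1.0):
    # robust Huber loss: quadratic near zero, linear in the tails,
    # bounding the influence of any single gross outlier
    abs_r = np.abs(residuals)
    quad = 0.5 * residuals ** 2
    lin = delta * (abs_r - 0.5 * delta)
    return np.mean(np.where(abs_r <= delta, quad, lin))

# clean residuals vs. the same residuals contaminated with one outlier
clean = np.array([0.1, -0.2, 0.05, 0.15])
contaminated = np.append(clean, 100.0)

mse_ratio = squared_loss(contaminated) / squared_loss(clean)
huber_ratio = huber_loss(contaminated) / huber_loss(clean)
print(mse_ratio > huber_ratio)  # the robust loss is far less inflated
```

In a neural network training loop, swapping the squared loss for such a robust loss changes only the objective (and hence the gradients), which is why robust losses can be adopted without altering the network architecture itself.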