We show that for neural network functions whose width is less than or equal to the input dimension, all connected components of the decision regions are unbounded. The result holds for continuous, strictly monotonic activation functions as well as for the ReLU activation function. This complements recent results on the approximation capabilities of such narrow neural networks by [Hanin 2017 Approximating] and on the connectivity of their decision regions by [Nguyen 2018 Neural]. Our results are illustrated by means of numerical experiments.
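A minimal sketch of the claimed phenomenon, using hypothetical hand-picked weights (not taken from the paper): a ReLU network of width 2 on 2-D input, i.e. width equal to the input dimension. A point in the positive decision region can be pushed arbitrarily far along a ray without leaving the region, consistent with each component being unbounded.

```python
import numpy as np

# Hypothetical weights for a width-2 ReLU network on 2-D input,
# so that f(x) = relu(x1) - relu(x2).
W1 = np.array([[1.0, 0.0], [0.0, 1.0]])  # hidden-layer weights (2x2)
b1 = np.zeros(2)                          # hidden-layer bias
w2 = np.array([1.0, -1.0])                # output-layer weights

def f(x):
    """Scalar logit of the width-2 ReLU network; x has shape (n, 2)."""
    h = np.maximum(x @ W1.T + b1, 0.0)    # ReLU hidden layer
    return h @ w2

# A point in the positive decision region {x : f(x) > 0} ...
p = np.array([[1.0, 0.0]])
print(f(p)[0] > 0)  # True

# ... stays in that region arbitrarily far along the ray t * p,
# since f(t, 0) = t for t > 0.
far = np.array([[t, 0.0] for t in (10.0, 1e3, 1e6)])
print((f(far) > 0).all())  # True
```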