In this paper, we introduce a new class of functions on $\mathbb{R}$ that is closed under composition and contains the logistic sigmoid function. We use this class to show that any 1-dimensional neural network of arbitrary depth with logistic sigmoid activation functions has at most three fixed points. While such neural networks are far from real-world applications, we are able to completely understand their fixed points, providing a foundation for the much-needed connection between the application and theory of deep neural networks.
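As an illustration of the claim (not taken from the paper), the following minimal Python sketch builds a deep 1-dimensional network with logistic sigmoid activations and counts its fixed points numerically by locating sign changes of $f(x) - x$ on a fine grid. The layer parameters are arbitrary choices for demonstration; for this particular network the count comes out to three, consistent with the stated bound.

```python
import math

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + e^{-x})."""
    return 1.0 / (1.0 + math.exp(-x))

def make_network(layers):
    """layers: list of (w, b, a) tuples; each layer maps x to a * sigmoid(w*x + b).
    Returns the composed 1-dimensional network as a function of x."""
    def f(x):
        for w, b, a in layers:
            x = a * sigmoid(w * x + b)
        return x
    return f

def count_fixed_points(f, lo=-10.0, hi=10.0, n=100_000):
    """Count fixed points of f on [lo, hi] by detecting sign changes
    (or exact zeros) of g(x) = f(x) - x on a uniform grid."""
    xs = [lo + (hi - lo) * i / n for i in range(n + 1)]
    g = [f(x) - x for x in xs]
    return sum(1 for i in range(n)
               if g[i] == 0.0 or (g[i] > 0) != (g[i + 1] > 0))

# A depth-3 network with steep, arbitrarily chosen layers.
net = make_network([(8.0, -4.0, 1.0), (8.0, -4.0, 1.0), (8.0, -4.0, 1.0)])
print(count_fixed_points(net))  # → 3
```

The grid-based count only lower-bounds the true number of fixed points (crossings closer together than the grid spacing could be missed), but for monotone compositions like this one it reliably finds the attracting fixed points near 0 and 1 and the repelling one at $x = 0.5$.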