We derive conditions for the existence of fixed points of nonnegative neural networks, an important research objective for understanding the behavior of neural networks in modern applications involving autoencoders and loop unrolling techniques, among others. In particular, we show that neural networks with nonnegative inputs and nonnegative parameters can be recognized as monotonic and (weakly) scalable functions within the framework of nonlinear Perron-Frobenius theory. This fact enables us to derive conditions for the existence of a nonempty fixed point set of nonnegative neural networks, and these conditions are weaker than those recently obtained using arguments in convex analysis, which typically rely on the assumption of nonexpansivity of the activation functions. Furthermore, we prove that the fixed point set of monotonic and weakly scalable neural networks is often an interval, which degenerates to a point in the case of scalable networks. The chief results of this paper are verified in numerical simulations, where we consider an autoencoder-type network that first compresses angular power spectra in massive MIMO systems and then reconstructs the input spectra from the compressed signals.
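To make the setting concrete, the following minimal NumPy sketch (not taken from the paper; the network sizes, the weights `W1`, `W2`, and the biases `b1`, `b2` are all hypothetical) builds a small two-layer ReLU network with nonnegative parameters, numerically checks the monotonicity and weak scalability properties named above, and runs the fixed-point iteration x_{k+1} = T(x_k):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nonnegative parameters for a small two-layer network
# T(x) = W2 @ relu(W1 @ x + b1) + b2; the sizes and the 0.25 scaling
# (chosen here so the iteration below converges) are assumptions,
# not values from the paper.
W1 = 0.25 * rng.random((4, 4))
W2 = 0.25 * rng.random((4, 4))
b1 = 0.1 * rng.random(4)
b2 = 0.1 * rng.random(4)

def T(x):
    """Two-layer ReLU network with nonnegative weights and biases."""
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Monotonicity: x <= y (elementwise) implies T(x) <= T(y).
x = rng.random(4)
y = x + rng.random(4)
assert np.all(T(x) <= T(y) + 1e-12)

# Weak scalability (subhomogeneity): T(lam * x) <= lam * T(x) for lam >= 1,
# which holds here because the biases are nonnegative and ReLU is monotone.
lam = 2.0
assert np.all(T(lam * x) <= lam * T(x) + 1e-12)

# Picard iteration x_{k+1} = T(x_k) started from the origin; with these
# small weights the map happens to be contractive, so the iterates converge
# to a fixed point (in general, the paper's conditions govern whether the
# fixed point set is nonempty).
z = np.zeros(4)
for _ in range(200):
    z = T(z)
print("approximate fixed point:", z)
print("residual ||T(z) - z||:", np.linalg.norm(T(z) - z))
```

Starting the iteration from the origin is a natural choice for monotonic maps: since T(0) >= 0, the iterates form a nondecreasing sequence, which converges whenever it is bounded above.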