We derive conditions for the existence of fixed points of cone mappings without assuming scalability. In the literature on fixed points of interference mappings, monotonicity and scalability are often treated as inseparable. In applications, such mappings are approximated by non-negative neural networks; training a non-negative network, however, requires imposing an artificial constraint on the model's weights. Yet for specific non-negative data, the non-negativity of a mapping does not imply that all of its weights are non-negative. We therefore study the existence of fixed points for general neural networks under tangency conditions with respect to specific cones. This does not relax the physical assumptions: even when the input and output are required to be non-negative, the weights may take (small) negative values. Such properties, often observed in work on the interpretability of neural network weights, allow the monotonicity and scalability assumptions on the mapping associated with a neural network to be weakened. To the best of our knowledge, this paper is the first to study this phenomenon.
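A minimal numerical sketch of the phenomenon described above (all values and names are illustrative assumptions, not taken from the paper): a one-layer ReLU map sends the non-negative cone into itself even though its weight matrix has a negative entry, so the map is neither monotone nor restricted to non-negative weights, yet plain fixed-point iteration can still locate a fixed point inside the cone.

import numpy as np

# Illustrative one-layer map N(x) = relu(W x + b); W deliberately contains
# a small negative entry, so N is not monotone, yet the ReLU guarantees
# that N maps the non-negative cone R^2_+ into itself.
def relu(z):
    return np.maximum(z, 0.0)

W = np.array([[0.9, -0.05],
              [0.1,  0.8]])   # spectral radius < 1; one negative weight
b = np.array([0.05, 0.05])

def N(x):
    return relu(W @ x + b)

# Plain iteration x_{k+1} = N(x_k); without monotonicity/scalability this
# is only a heuristic, which is the gap the tangency conditions address.
x = np.ones(2)
for _ in range(200):
    x = N(x)
print("candidate fixed point:", x)            # converges to about [0.3, 0.4]
print("residual:", np.linalg.norm(N(x) - x))  # ~0 at a fixed point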