This note addresses the Kolmogorov-Arnold Representation Theorem (KART) and the Universal Approximation Theorem (UAT), focusing on the misinterpretations of both results that frequently appear in the neural network literature. Our remarks aim to support a more accurate understanding of KART and UAT among neural network specialists. In addition, we explore the minimal number of neurons required for universal approximation, showing that the number of neurons needed for exact representation of functions in KART-based networks also suffices for standard multilayer perceptrons in the approximation setting.