We generalize the classical universal approximation theorem for neural networks to the case of complex-valued neural networks. Precisely, we consider feedforward networks with a complex activation function $\sigma : \mathbb{C} \to \mathbb{C}$ in which each neuron performs the operation $\mathbb{C}^N \to \mathbb{C}, z \mapsto \sigma(b + w^T z)$ with weights $w \in \mathbb{C}^N$ and a bias $b \in \mathbb{C}$, and with $\sigma$ applied componentwise. We completely characterize those activation functions $\sigma$ for which the associated complex networks have the universal approximation property, meaning that they can uniformly approximate any continuous function on any compact subset of $\mathbb{C}^d$ arbitrarily well. Unlike the classical case of real networks, the set of "good activation functions" which give rise to networks with the universal approximation property differs significantly depending on whether one considers deep networks or shallow networks: for deep networks with at least two hidden layers, the universal approximation property holds as long as $\sigma$ is neither a polynomial, nor a holomorphic function, nor an antiholomorphic function. Shallow networks, on the other hand, are universal if and only if the real part or the imaginary part of $\sigma$ is not a polyharmonic function.
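To make the setting concrete, the following is a minimal sketch of the neuron operation $z \mapsto \sigma(b + w^T z)$ and of a shallow complex network built from it. The specific "split" activation used here (applying $\tanh$ separately to the real and imaginary parts) is our own illustrative choice, not one singled out by the result; it merely serves as an example of an activation that is neither holomorphic nor antiholomorphic, since it does not satisfy the Cauchy–Riemann equations.

```python
import numpy as np

def sigma(z):
    # Illustrative "split" activation, applied componentwise: it acts on the
    # real and imaginary parts separately, so it is neither holomorphic nor
    # antiholomorphic. (This is an example choice, not the paper's.)
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def neuron(z, w, b):
    # One complex neuron C^N -> C:  z |-> sigma(b + w^T z).
    return sigma(b + w @ z)

def shallow_network(z, W, b, a, c):
    # Shallow (one-hidden-layer) complex network with M neurons:
    #   z |-> c + sum_m a_m * sigma(b_m + w_m^T z),
    # where W stacks the weight vectors w_m as rows.
    return c + a @ sigma(W @ z + b)

rng = np.random.default_rng(0)
N, M = 3, 5
z = rng.standard_normal(N) + 1j * rng.standard_normal(N)
W = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))
b = rng.standard_normal(M) + 1j * rng.standard_normal(M)
a = rng.standard_normal(M) + 1j * rng.standard_normal(M)

out = shallow_network(z, W, b, a, 0.0)
print(out)  # a single complex number in C
```

Deep networks in the sense of the abstract are obtained by composing such layers, with $\sigma$ applied componentwise between the affine maps.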