Until recently, applications of neural networks in machine learning relied almost exclusively on real-valued networks. It has been observed, however, that complex-valued neural networks (CVNNs) exhibit superior performance in applications in which the input is naturally complex-valued, such as MRI fingerprinting. While the mathematical theory of real-valued networks has by now reached some level of maturity, this is far from true for complex-valued networks. In this paper, we analyze the expressivity of complex-valued networks by providing explicit quantitative error bounds for approximating $C^n$ functions on compact subsets of $\mathbb{C}^d$ by complex-valued neural networks that employ the modReLU activation function, given by $\sigma(z) = \mathrm{ReLU}(|z| - 1) \, \mathrm{sgn}(z)$, which is one of the most popular complex activation functions used in practice. We show that the derived approximation rates are optimal (up to log factors) in the class of modReLU networks with weights of moderate growth.
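To make the activation concrete, the following is a minimal NumPy sketch of modReLU as defined above, where $\mathrm{sgn}(z) = z/|z|$ for $z \neq 0$. The function name `modrelu` and the bias parameter `b` are illustrative; the definition in the abstract corresponds to the fixed choice $b = -1$.

```python
import numpy as np

def modrelu(z: np.ndarray, b: float = -1.0) -> np.ndarray:
    """modReLU activation: ReLU(|z| + b) * sgn(z).

    With b = -1 this is sigma(z) = ReLU(|z| - 1) * sgn(z) as in the abstract.
    """
    r = np.abs(z)
    # sgn(z) = z / |z| away from the origin; for b <= 0 the output at z = 0
    # is 0 regardless, since ReLU(|0| + b) = 0, so the convention sgn(0) = 0
    # is immaterial there.
    phase = np.divide(z, r, out=np.zeros_like(z), where=r > 0)
    return np.maximum(r + b, 0.0) * phase

# Example: inputs with modulus above 1 pass through with shrunken modulus
# and unchanged phase; inputs inside the unit disk are zeroed out.
print(modrelu(np.array([2 + 0j, 0.5j, 0j])))  # -> [1.+0.j, 0.+0.j, 0.+0.j]
```

Note that, unlike a coordinatewise application of the real ReLU to real and imaginary parts, modReLU acts only on the modulus and preserves the phase of its input, which is why it is a natural choice for genuinely complex-valued data.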