In this paper, we study approximation properties of single hidden layer neural networks whose weights vary on finitely many directions and whose thresholds are taken from an open interval. We obtain a measure-theoretic condition that is both necessary and sufficient for the density of such networks in the space of continuous functions. Further, we prove a density result for neural networks with a specifically constructed activation function and a fixed number of neurons.
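For concreteness, the class of networks described above can be written schematically as follows; the notation here (the direction set $\{a^1,\dots,a^k\}$, the threshold interval $(\alpha,\beta)$, and the activation function $\sigma$) is our own shorthand for the abstract's description, not the paper's fixed notation:
\[
\mathcal{M}\bigl(\sigma; a^1,\dots,a^k; (\alpha,\beta)\bigr)
= \left\{ \sum_{i=1}^{r} c_i\,\sigma\!\bigl(a^{j_i}\cdot x - t_i\bigr)
\;:\; c_i \in \mathbb{R},\ j_i \in \{1,\dots,k\},\ t_i \in (\alpha,\beta),\ r \in \mathbb{N} \right\}.
\]
The density question then asks for which activation functions $\sigma$ (and which compact sets $X \subset \mathbb{R}^d$) this set is dense in $C(X)$ with respect to the uniform norm; the second result mentioned in the abstract concerns the subclass in which the number of summands $r$ is fixed.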