In this paper we consider a new class of RBF (Radial Basis Function) neural networks, in which smoothing factors are replaced with shifts. We prove that, under certain conditions on the activation function, these networks are capable of approximating any continuous multivariate function on any compact subset of the $d$-dimensional Euclidean space. For RBF networks with finitely many fixed centroids we describe conditions guaranteeing approximation with arbitrary precision.
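As a rough illustration (the precise network form below is our assumption, not stated in the abstract): a standard RBF network computes $\sum_i c_i \varphi(\|x - a^i\| / \sigma_i)$ with smoothing factors $\sigma_i$, and the modification considered here replaces the smoothing factor with an additive shift $\theta_i$, giving units of the form $\varphi(\|x - a^i\| - \theta_i)$. A minimal numerical sketch:

```python
import numpy as np

def rbf_shift_network(x, centroids, shifts, weights, phi=np.tanh):
    """Evaluate sum_i w_i * phi(||x - a_i|| - theta_i).

    Hypothetical illustration of an RBF-type network in which the
    usual smoothing factor (a divisor of the radial distance) is
    replaced by an additive shift theta_i. The activation phi and
    all parameter values here are placeholders for illustration.
    """
    dists = np.linalg.norm(centroids - x, axis=1)  # ||x - a_i|| for each unit
    return float(weights @ phi(dists - shifts))

# Usage: a network with three fixed centroids in R^2
centroids = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
shifts = np.array([0.5, 0.2, 0.1])
weights = np.array([1.0, -2.0, 0.5])
y = rbf_shift_network(np.array([0.3, 0.4]), centroids, shifts, weights)
```

In this sketch the centroids $a^i$ are fixed, matching the setting of the second result: only the shifts $\theta_i$ and outer weights $c_i$ would be tuned to drive the approximation error down.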