In this paper we consider a new class of RBF (Radial Basis Function) neural networks, in which smoothing factors are replaced with shifts. We prove, under certain conditions on the activation function, that these networks can approximate any continuous multivariate function on any compact subset of $d$-dimensional Euclidean space. For RBF networks with finitely many fixed centroids, we describe conditions that guarantee approximation with arbitrary precision.
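As a purely illustrative sketch (not the paper's construction), one way to read "smoothing factors replaced with shifts" is a network of the form $g(x) = \sum_i w_i\,\varphi(\lVert x - a_i\rVert - s_i)$, where the $a_i$ are fixed centroids and the per-unit shifts $s_i$ replace the usual width parameters. The snippet below assumes this form with a Gaussian-type $\varphi$ and fits only the outer weights by least squares on a 1-D target; all names and the choice of activation are assumptions for illustration.

```python
import numpy as np

def shifted_rbf(x, centroids, shifts, weights, phi=lambda t: np.exp(-t**2)):
    """Evaluate g(x) = sum_i w_i * phi(|x - a_i| - s_i) for a 1-D input array x.

    Illustrative/assumed form: shifts s_i play the tunable role that
    smoothing (width) factors play in a classical RBF network.
    """
    t = np.abs(x[:, None] - centroids[None, :]) - shifts[None, :]
    return phi(t) @ weights

# Finitely many fixed centroids on [0, 2*pi]; shifts drawn at random,
# then the linear output weights fitted by least squares to f(x) = sin(x).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2 * np.pi, 200)
centroids = np.linspace(0.0, 2 * np.pi, 12)
shifts = rng.uniform(-0.5, 0.5, size=12)

design = np.exp(-(np.abs(x[:, None] - centroids[None, :]) - shifts[None, :]) ** 2)
weights, *_ = np.linalg.lstsq(design, np.sin(x), rcond=None)

approx = shifted_rbf(x, centroids, shifts, weights)
max_err = np.max(np.abs(approx - np.sin(x)))
```

This only demonstrates the network's functional form and the role of fixed centroids; the paper's actual result concerns density of such networks in the space of continuous functions, not a particular fitting procedure.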