This paper introduces a tensor neural network (TNN) for nonparametric regression problems. The TNN's distinctive sub-network structure enables variable separation, which improves the approximation of complex, unknown functions. Our comparative analysis shows that, at a comparable number of parameters, the TNN outperforms conventional Feed-Forward Networks (FFN) and Radial Basis Function Networks (RBN) in both approximation accuracy and generalization ability. A key innovation of our approach is the integration of statistical regression and numerical integration within the TNN framework, which allows high-dimensional integrals associated with the regression function to be computed efficiently. This advance is relevant to a broad range of applications, particularly those demanding accurate high-dimensional data analysis and prediction.
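As a rough illustration of the variable-separation idea described above (not the authors' implementation), the sketch below approximates a d-dimensional regression function by a rank-r sum of products of one-dimensional factors, with one sub-network per input coordinate. The class name `TNN` and the parameters `rank` and `hidden` are illustrative choices, not taken from the paper.

```python
import torch
import torch.nn as nn

class TNN(nn.Module):
    """Minimal sketch of a tensor neural network:
    f(x_1, ..., x_d) ~ sum_{j=1..r} prod_{i=1..d} phi_{i,j}(x_i),
    where each one-dimensional factor phi_{i,.} comes from its own sub-network."""

    def __init__(self, dim: int, rank: int, hidden: int = 32):
        super().__init__()
        # One small sub-network per input coordinate; each maps the scalar
        # x_i to `rank` factor values phi_{i,1}(x_i), ..., phi_{i,r}(x_i).
        self.subnets = nn.ModuleList(
            nn.Sequential(
                nn.Linear(1, hidden), nn.Tanh(),
                nn.Linear(hidden, hidden), nn.Tanh(),
                nn.Linear(hidden, rank),
            )
            for _ in range(dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, dim); feed each coordinate to its sub-network.
        factors = torch.stack(
            [net(x[:, i:i + 1]) for i, net in enumerate(self.subnets)],
            dim=1,
        )                                         # (batch, dim, rank)
        # Multiply factors across coordinates, then sum over the rank index.
        return factors.prod(dim=1).sum(dim=1)     # (batch,)

# Usage: fit a synthetic 5-dimensional target with a plain MSE loss.
model = TNN(dim=5, rank=8)
x = torch.rand(256, 5)
y = torch.sin(x.sum(dim=1))
loss = nn.MSELoss()(model(x), y)
loss.backward()
```

Because each term of the sum factorizes over coordinates, a d-dimensional integral of such a model reduces to products of one-dimensional integrals of the sub-network outputs, which is the property the abstract exploits for efficient high-dimensional integration.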