Interpolation models are critical for a wide range of applications, from numerical optimization to artificial intelligence. The reliability of the interpolated values is of utmost importance, and the onset of spurious noise must be avoided. Some noise sources can be prevented by proper design of the training set, but in certain cases data sparsity is unavoidable. A typical example is the application of an optimization algorithm: new data accumulates in the region where the minimum or maximum of the objective function is expected to lie, while other regions of the design-variable space remain poorly sampled. In such cases, regularization of the interpolation model becomes crucial. In this paper we present an approach for regularizing an interpolator by controlling its kernel function through the condition number of the self-correlation matrix.
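The idea of stabilizing a kernel interpolator through the condition number of its self-correlation matrix can be illustrated with a minimal sketch. The Gaussian kernel, the nugget-style Tikhonov term `lam * I`, the threshold `cond_max`, and the tenfold growth schedule below are all illustrative assumptions, not the specific method of this paper:

```python
import numpy as np

def gaussian_kernel(X, eps):
    """Self-correlation matrix K[i, j] = exp(-eps * ||x_i - x_j||^2)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-eps * d2)

def regularized_weights(X, y, eps=1.0, cond_max=1e8):
    """Solve (K + lam * I) w = y, growing the nugget lam until the
    condition number of the regularized matrix drops below cond_max."""
    K = gaussian_kernel(X, eps)
    I = np.eye(len(X))
    lam = 0.0
    while np.linalg.cond(K + lam * I) > cond_max:
        lam = max(lam * 10.0, 1e-12)  # illustrative growth schedule
    w = np.linalg.solve(K + lam * I, y)
    return w, lam
```

With clustered samples (as produced by an optimizer refining one region), the unregularized matrix is nearly singular, and the nugget trades a small bias for a well-conditioned solve.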