This paper considers the hyperparameter optimization problem for mathematical techniques that arise in the numerical solution of differential and integral equations. The well-known grid search and random search approaches are developed in a parallel manner to find the optimal set of hyperparameters. Employing rational Jacobi functions, we apply these algorithms to two nonlinear benchmark differential equations on the semi-infinite domain. The configurations comprise different rational mappings together with their length-scale parameter and the Jacobi function parameters. The trials are run on the collocation Least-Squares Support Vector Regression (CLS-SVR) model, a novel numerical simulation approach based on spectral methods. In addition, we address the sensitivity of the numerical stability and convergence of the CLS-SVR model to these hyperparameters. The experiments show that this technique can effectively improve state-of-the-art results.
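The following is a minimal sketch of the parallel grid and random search over the configuration space named in the abstract (rational mapping type, length-scale parameter, and Jacobi parameters). The function `evaluate_config` is a hypothetical stand-in for fitting the CLS-SVR model and returning a residual norm; the specific mapping names and parameter ranges are illustrative assumptions, not values from the paper.

```python
# Sketch of parallel grid/random hyperparameter search, assuming a generic
# score function in place of the actual CLS-SVR fit described in the paper.
import itertools
import random
from concurrent.futures import ProcessPoolExecutor

MAPPINGS = ["algebraic", "logarithmic", "exponential"]  # assumed rational mappings to the semi-infinite domain
LENGTH_SCALES = [0.5, 1.0, 2.0, 4.0]                     # mapping length-scale parameter L (illustrative grid)
JACOBI_ALPHAS = [-0.5, 0.0, 0.5]                         # Jacobi parameter alpha (illustrative grid)
JACOBI_BETAS = [-0.5, 0.0, 0.5]                          # Jacobi parameter beta (illustrative grid)


def evaluate_config(cfg):
    """Hypothetical objective: fit the CLS-SVR model with `cfg` and return a
    residual norm. Replaced by a placeholder so the sketch runs standalone."""
    mapping, length_scale, alpha, beta = cfg
    return abs(length_scale - 1.0) + abs(alpha) + abs(beta)  # placeholder score, lower is better


def grid_search(workers=4):
    """Exhaustive search over the Cartesian product of hyperparameter values,
    with configurations evaluated in parallel across worker processes."""
    grid = list(itertools.product(MAPPINGS, LENGTH_SCALES, JACOBI_ALPHAS, JACOBI_BETAS))
    with ProcessPoolExecutor(max_workers=workers) as pool:
        scores = list(pool.map(evaluate_config, grid))
    return min(zip(scores, grid))


def random_search(n_trials=50, workers=4, seed=0):
    """Random sampling of configurations from assumed continuous ranges,
    evaluated in parallel like the grid search above."""
    rng = random.Random(seed)
    trials = [
        (rng.choice(MAPPINGS), rng.uniform(0.1, 5.0), rng.uniform(-0.9, 2.0), rng.uniform(-0.9, 2.0))
        for _ in range(n_trials)
    ]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        scores = list(pool.map(evaluate_config, trials))
    return min(zip(scores, trials))


if __name__ == "__main__":
    print("grid search best (score, config):", grid_search())
    print("random search best (score, config):", random_search())
```

In an actual experiment, `evaluate_config` would solve the benchmark equation with the CLS-SVR model under the given mapping and Jacobi parameters and report a convergence or residual measure, so the two search strategies can be compared on equal footing.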