We propose an efficient and easy-to-implement gradient-enhanced least squares Monte Carlo method for computing the prices and Greeks (i.e., derivatives of the price function) of high-dimensional American options. It employs a sparse Hermite polynomial expansion as a surrogate model for the continuation value function, and crucially exploits the fast evaluation of gradients. The expansion coefficients are computed by solving a linear least squares problem enhanced with gradient information from simulated paths. We analyze the convergence of the proposed method and establish an error estimate in terms of the best approximation error in the weighted $H^1$ space, the statistical error of solving the discrete least squares problems, and the time step size. We present comprehensive numerical experiments to illustrate the performance of the proposed method. The results show that, at nearly identical computational cost, it outperforms the state-of-the-art least squares Monte Carlo method in high dimensions, producing more accurate prices, Greeks, and optimal exercise strategies, and that it delivers results comparable to recent neural network-based methods in up to 100 dimensions.
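As an illustrative sketch only (the notation below is assumed and not taken from the paper), the gradient-enhanced fit described above can be read as a regularized linear least squares problem over the expansion coefficients $\mathbf{c} = (c_\alpha)_{\alpha\in\Lambda}$:
$$\min_{\mathbf{c}}\ \frac{1}{M}\sum_{m=1}^{M}\Bigg(\Big(y_m - \sum_{\alpha\in\Lambda} c_\alpha H_\alpha(X_m)\Big)^2 \;+\; \lambda\,\Big\|\, g_m - \sum_{\alpha\in\Lambda} c_\alpha \nabla H_\alpha(X_m)\Big\|^2\Bigg),$$
where $X_m$ denotes the simulated state on path $m$, $y_m$ and $g_m$ are the sampled continuation value and its gradient along that path, $H_\alpha$ are Hermite polynomials indexed by a sparse index set $\Lambda$, and $\lambda>0$ weights the gradient data relative to the function values; all of these symbols are hypothetical placeholders used here only to convey the structure of a gradient-enhanced least squares problem.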