We develop a reproducing kernel Hilbert space (RKHS) framework for nonparametric mean-variance optimization and inference on shape constraints of the optimal rule. We derive statistical properties of the sample estimator and provide rigorous theoretical guarantees, including asymptotic consistency, a functional central limit theorem, and a finite-sample deviation bound that matches the Monte Carlo rate up to regularization. Building on these results, we introduce a joint Wald-type statistic to test for shape constraints over finite grids. The approach comes with an efficient computational procedure based on a pivoted Cholesky factorization, facilitating scalability to large datasets. Empirical results are favorable to the proposed methodology.
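To give a concrete sense of the kind of pivoted Cholesky routine such a computational procedure can rest on, the following is a minimal sketch of a greedy low-rank approximation of a kernel Gram matrix. The Gaussian (RBF) kernel, the function names, the rank cap, and the stopping tolerance are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rbf_kernel_column(X, i, lengthscale=1.0):
    """Column i of the Gaussian (RBF) kernel Gram matrix, without forming the full matrix.
    (Illustrative choice of kernel; any positive semidefinite kernel would do.)"""
    sq_dists = np.sum((X - X[i]) ** 2, axis=1)
    return np.exp(-sq_dists / (2.0 * lengthscale ** 2))

def pivoted_cholesky(X, max_rank, tol=1e-8, lengthscale=1.0):
    """Return a factor L with K ~= L @ L.T, where K is the RBF Gram matrix of X.
    Pivots are chosen greedily by the largest remaining diagonal residual, so only
    max_rank kernel columns are ever evaluated."""
    n = X.shape[0]
    L = np.zeros((n, max_rank))
    diag_resid = np.ones(n)             # RBF kernel has unit diagonal
    for k in range(max_rank):
        i = int(np.argmax(diag_resid))  # next pivot: largest residual variance
        if diag_resid[i] <= tol:        # residual trace (numerically) exhausted
            return L[:, :k]
        col = rbf_kernel_column(X, i, lengthscale)
        # Schur-complement update of column i against the factors built so far.
        L[:, k] = (col - L[:, :k] @ L[i, :k]) / np.sqrt(diag_resid[i])
        diag_resid = np.maximum(diag_resid - L[:, k] ** 2, 0.0)
    return L

# Usage: a rank-50 surrogate for the 10,000 x 10,000 Gram matrix of synthetic data.
X = np.random.default_rng(0).normal(size=(10_000, 3))
L = pivoted_cholesky(X, max_rank=50)
```

Because the factor is built column by column from at most max_rank kernel evaluations per step, the cost is O(n * max_rank^2) time and O(n * max_rank) memory, which is what makes this style of factorization attractive for large datasets.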