We propose a new method for estimating the minimizer $\boldsymbol{x}^*$ and the minimum value $f^*$ of a smooth and strongly convex regression function $f$ from observations contaminated by random noise. Our estimator $\boldsymbol{z}_n$ of the minimizer $\boldsymbol{x}^*$ is based on a version of projected gradient descent with the gradient estimated by a regularized local polynomial algorithm. Next, we propose a two-stage procedure for estimating the minimum value $f^*$ of the regression function $f$. At the first stage, we construct a sufficiently accurate estimator of $\boldsymbol{x}^*$, which can be, for example, $\boldsymbol{z}_n$. At the second stage, we estimate the function value at the point obtained at the first stage using a rate-optimal nonparametric procedure. We derive non-asymptotic upper bounds for the quadratic risk and optimization error of $\boldsymbol{z}_n$, and for the risk of estimating $f^*$. We establish minimax lower bounds showing that, under a certain choice of parameters, the proposed algorithms achieve the minimax optimal rates of convergence on the class of smooth and strongly convex functions.
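The projected-gradient idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm: the step size, noise level, projection set, and in particular the averaged finite-difference gradient surrogate (standing in for the paper's regularized local polynomial estimator) are all assumptions chosen purely for illustration.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball {x : ||x|| <= radius}."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def noisy_eval(f, x, sigma, rng):
    """Observation model: function value corrupted by Gaussian noise."""
    return f(x) + sigma * rng.normal()

def estimate_gradient(f, x, h, sigma, rng, n_rep=50):
    """Crude surrogate for the gradient estimator: averaged central
    finite differences of noisy evaluations (the paper instead uses a
    regularized local polynomial estimator)."""
    d = x.size
    g = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        diffs = [(noisy_eval(f, x + e, sigma, rng)
                  - noisy_eval(f, x - e, sigma, rng)) / (2 * h)
                 for _ in range(n_rep)]
        g[i] = np.mean(diffs)
    return g

def projected_gd(f, x0, steps=200, eta=0.1, h=1e-2, sigma=0.01, seed=0):
    """Projected gradient descent driven by estimated gradients."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(steps):
        g = estimate_gradient(f, x, h, sigma, rng)
        x = project_ball(x - eta * g)
    return x

# Strongly convex test function with minimizer at (0.3, -0.2).
x_star = np.array([0.3, -0.2])
f = lambda x: np.sum((x - x_star) ** 2)
z = projected_gd(f, x0=np.array([0.9, 0.9]))
```

The returned point `z` plays the role of the estimator $\boldsymbol{z}_n$; a second stage would then estimate $f^*$ by running a nonparametric regression estimate of $f$ at `z`.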