Let $\{P_{\theta}:\theta \in {\mathbb R}^d\}$ be a log-concave location family with $P_{\theta}(dx)=e^{-V(x-\theta)}dx,$ where $V:{\mathbb R}^d\mapsto {\mathbb R}$ is a known convex function, and let $X_1,\dots, X_n$ be i.i.d. random variables sampled from the distribution $P_{\theta}$ with an unknown location parameter $\theta.$ The goal is to estimate the value $f(\theta)$ of a smooth functional $f:{\mathbb R}^d\mapsto {\mathbb R}$ based on the observations $X_1,\dots, X_n.$ In the case when $V$ is sufficiently smooth and $f$ is a functional from a ball in a H\"older space $C^s,$ we develop estimators of $f(\theta)$ with minimax optimal error rates measured by the $L_2({\mathbb P}_{\theta})$-distance as well as by more general Orlicz norm distances. Moreover, we show that, if $d\leq n^{\alpha}$ for some $\alpha\in (0,1)$ and $s>\frac{1}{1-\alpha},$ then the resulting estimators are asymptotically efficient in the H\'ajek--Le Cam sense with convergence rate $\sqrt{n}.$ This generalizes earlier results on estimation of smooth functionals in Gaussian shift models. The estimators have the form $f_k(\hat \theta),$ where $\hat \theta$ is the maximum likelihood estimator and $f_k: {\mathbb R}^d\mapsto {\mathbb R}$ (with $k$ depending on $s$) are functionals defined in terms of $f$ and designed to provide a higher order bias reduction in the functional estimation problem. The method of bias reduction is based on an iterative parametric bootstrap and has been successfully used before in the case of Gaussian models.
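As a brief illustration of the bias-reduction idea, here is a minimal sketch assuming the operator-based construction used in the earlier work on Gaussian shift models carries over to this setting (the operator notation below is introduced only for exposition). Define the bootstrap operator ${\mathcal T}$ and the bias operator ${\mathcal B}$ by
\[
({\mathcal T} f)(\theta) := {\mathbb E}_{\theta} f(\hat \theta), \qquad {\mathcal B} := {\mathcal T} - {\mathcal I},
\]
where ${\mathcal I}$ is the identity operator, and set
\[
f_k := \sum_{j=0}^{k} (-1)^j {\mathcal B}^j f.
\]
A telescoping computation then gives ${\mathbb E}_{\theta} f_k(\hat \theta) - f(\theta) = (-1)^k ({\mathcal B}^{k+1} f)(\theta),$ so that, for sufficiently smooth $f,$ the bias of $f_k(\hat \theta)$ is of higher order than that of the plug-in estimator $f(\hat \theta).$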