We study the minimax estimation of $\alpha$-divergences between discrete distributions for integer $\alpha\ge 1$, which include the Kullback--Leibler divergence and the $\chi^2$-divergence as special cases. Dropping the usual theoretical tricks to acquire independence, we construct the first minimax rate-optimal estimator that does not require any Poissonization, sample splitting, or explicit construction of approximating polynomials. The estimator uses a hybrid approach: it solves a problem-independent linear program based on moment matching in the non-smooth regime, and applies a problem-dependent bias-corrected plug-in estimator in the smooth regime, with a soft decision boundary between these regimes.
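To give a flavor of the smooth/non-smooth hybrid structure, the following is a minimal illustrative sketch, not the paper's estimator. It estimates the power sum $\sum_i p_i^\alpha$ (the core ingredient of integer-order $\alpha$-divergences) with a bias-corrected plug-in on symbols whose counts exceed a threshold, and a raw plug-in placeholder below it. The function name `power_sum_estimate`, the hard $\log n$ threshold, and the placeholder fallback are all assumptions for illustration; the actual construction solves a moment-matching linear program in the non-smooth regime and uses a soft decision boundary.

```python
import numpy as np

def power_sum_estimate(counts, n, alpha=2, threshold=None):
    """Illustrative hybrid estimator of sum_i p_i**alpha from multinomial counts.

    Smooth regime (large counts): plug-in minus a first-order Taylor bias term.
    Non-smooth regime (small counts): raw plug-in as a stand-in for the paper's
    moment-matching linear program, which is omitted here.
    """
    if threshold is None:
        threshold = np.log(n)  # illustrative regime boundary, not the paper's choice
    p_hat = counts / n
    smooth = counts > threshold

    # First-order bias of p_hat**alpha under Binomial(n, p) sampling:
    # E[p_hat**alpha] ~ p**alpha + alpha*(alpha-1)/2 * p**(alpha-1)*(1-p)/n,
    # so subtract the estimated bias term in the smooth regime.
    bias = alpha * (alpha - 1) / 2 * p_hat ** (alpha - 1) * (1 - p_hat) / n
    est = np.sum(p_hat[smooth] ** alpha - bias[smooth])

    # Non-smooth regime placeholder (real estimator: problem-independent LP).
    est += np.sum(p_hat[~smooth] ** alpha)
    return est
```

For a uniform distribution on 10 symbols with $\alpha = 2$, the true power sum is $10 \times 0.1^2 = 0.1$, and the sketch recovers it closely at moderate sample sizes; the bias correction matters most when many symbols have counts near the threshold.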