We develop polynomial-time algorithms for near-optimal minimax mean estimation under $\ell_2$-squared loss in a Gaussian sequence model under convex constraints. The parameter space is an origin-symmetric, type-2 convex body $K \subset \mathbb{R}^n$, and we assume additional regularity conditions: specifically, we assume $K$ is well-balanced, i.e., there exist known radii $r, R > 0$ such that $r B_2 \subseteq K \subseteq R B_2$, and that we have oracle access to the Minkowski gauge of $K$. Under these and some further assumptions on $K$, our procedures achieve the minimax rate up to factors polylogarithmic in the dimension, while remaining computationally efficient. We further extend our methodology to the linear regression and robust heavy-tailed settings, establishing polynomial-time near-optimal estimators when the constraint set satisfies the regularity conditions above. To the best of our knowledge, these results provide the first general framework for attaining statistically near-optimal performance under such broad geometric constraints while preserving computational tractability.
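As a concrete illustration of the two regularity conditions (gauge-oracle access and well-balancedness), the sketch below takes $K$ to be the $\ell_1$ ball of radius $c$ in $\mathbb{R}^n$, a hypothetical example not drawn from the paper: its Minkowski gauge $\|x\|_K = \inf\{t > 0 : x/t \in K\}$ is $\|x\|_1 / c$, and the inclusions $r B_2 \subseteq K \subseteq R B_2$ hold with $r = c/\sqrt{n}$ and $R = c$, which is certified pointwise by the sandwich $\|x\|_2 / R \le \|x\|_K \le \|x\|_2 / r$.

```python
import math
import random

# Hypothetical illustration: K is the l1 ball {x : sum_i |x_i| <= c} in R^n.
# Its Minkowski gauge ||x||_K = inf{t > 0 : x/t in K} equals ||x||_1 / c.
def gauge_l1_ball(x, c):
    """Minkowski gauge oracle for the l1 ball of radius c."""
    return sum(abs(xi) for xi in x) / c

n, c = 50, 3.0
# Well-balanced radii for this K: r B_2 ⊆ K ⊆ R B_2 with r = c/sqrt(n), R = c,
# since ||x||_2 <= ||x||_1 <= sqrt(n) ||x||_2 for all x in R^n.
r, R = c / math.sqrt(n), c

random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(n)]
l2 = math.sqrt(sum(xi * xi for xi in x))
g = gauge_l1_ball(x, c)

# The pointwise sandwich ||x||_2 / R <= ||x||_K <= ||x||_2 / r certifies
# that K is well-balanced with the radii (r, R) above.
assert l2 / R <= g <= l2 / r
```

This is only a toy constraint set; the paper's results apply to general origin-symmetric type-2 bodies accessed through such a gauge oracle.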