This work studies the computational aspects of multivariate convex regression in dimensions $d \ge 5$. Our results include the \emph{first} estimators that are minimax optimal (up to logarithmic factors) with runtime polynomial in the sample size, both for $L$-Lipschitz convex regression and for $\Gamma$-bounded convex regression under polytopal support. Our analysis combines techniques from empirical process theory, stochastic geometry, and potential theory, and leverages recent algorithmic advances in mean estimation for random vectors and in distribution-free linear regression. These results provide the first efficient, minimax-optimal procedures for non-Donsker classes whose corresponding least-squares estimators are provably minimax-suboptimal.