It is often of interest to estimate regression functions non-parametrically (nonparametric regression, NPR). Penalized regression (PR) is a statistically effective, well-studied solution to this problem. Unfortunately, in many cases, finding exact solutions to PR problems is computationally intractable. In this manuscript, we propose a mesh-based approximate solution (MBS) for those scenarios. MBS transforms the complicated functional minimization of NPR into a finite-parameter, discrete convex minimization, and allows us to leverage the tools of modern convex optimization. We show applications of MBS in a number of explicit examples (including both uni- and multivariate regression), and explore how the number of parameters must increase with the sample size in order for MBS to maintain the rate-optimality of NPR. We also give an efficient algorithm for minimizing the MBS objective that effectively leverages the sparsity inherent in MBS.