Series and polynomial regression can approximate the same function classes as neural networks. However, although they offer more interpretability than neural networks, these methods are rarely used in practice. In this paper, we show that a potential reason for this is the slow convergence rate of polynomial regression estimators, and we propose bagged polynomial regression (BPR) as an attractive alternative to neural networks. Theoretically, we derive new finite-sample and asymptotic $L^2$ convergence rates for series estimators. We show that the rates can be improved in smooth settings by splitting the feature space and generating polynomial features separately for each partition. Empirically, we show that our proposed estimator, the BPR, can perform as well as more complex models with more parameters. Our estimator also performs close to state-of-the-art prediction methods on the benchmark MNIST handwritten digit dataset.
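To make the idea concrete, the following is a minimal sketch of bagging polynomial regression estimators, assuming scikit-learn's BaggingRegressor, PolynomialFeatures, and LinearRegression as stand-ins; the polynomial degree, ensemble size, feature subsampling, and simulated data are illustrative choices, not the paper's exact specification of BPR or of the feature-space partitioning.

```python
# Illustrative sketch: bagging polynomial regressions (not the paper's exact BPR).
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Simulated regression data (hypothetical; for demonstration only).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(1000, 5))
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2] + 0.1 * rng.standard_normal(1000)

# Each base learner fits a polynomial regression on a bootstrap sample and a
# random subset of the features; predictions are averaged across the ensemble.
base = make_pipeline(PolynomialFeatures(degree=3), LinearRegression())
bpr = BaggingRegressor(base, n_estimators=25, max_features=3, bootstrap=True)
bpr.fit(X, y)
print(bpr.predict(X[:5]))
```

Averaging many low-variance polynomial fits over resampled data and feature subsets is one simple way to keep the parameter count of each base learner small while still capturing interactions, which is the spirit of the comparison with larger neural network models in the abstract.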