Recent theoretical studies have shown that kernel ridgeless regression can guarantee good generalization without explicit regularization. In this paper, we investigate the statistical properties of ridgeless regression with random features and stochastic gradient descent. We analyze the effects of the stochastic gradient and of the random features separately. In particular, the random-features error exhibits a double-descent curve. Motivated by these theoretical findings, we propose a tunable kernel algorithm that optimizes the spectral density of the kernel during training. Our work bridges interpolation theory and practical algorithms.
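The setting described above can be illustrated with a minimal sketch: random Fourier features approximating a Gaussian kernel, fitted by plain SGD on the squared loss with no ridge penalty (ridgeless). The data, bandwidth `sigma`, feature count `D`, and learning rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (illustrative only)
n = 40
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

# Random Fourier features approximating a Gaussian kernel;
# sigma is a hypothetical bandwidth choice
D = 200          # number of random features (overparameterized: D > n)
sigma = 1.0
W = rng.standard_normal((1, D)) / sigma
b = rng.uniform(0, 2 * np.pi, size=D)
Phi = np.sqrt(2.0 / D) * np.cos(X @ W + b)   # (n, D) feature matrix

# Plain SGD on the squared loss, with no ridge penalty (ridgeless):
# iterates drive the training error toward interpolation
theta = np.zeros(D)
lr = 0.5
for epoch in range(200):
    for i in rng.permutation(n):
        resid = Phi[i] @ theta - y[i]
        theta -= lr * resid * Phi[i]

train_mse = np.mean((Phi @ theta - y) ** 2)
print(f"training MSE: {train_mse:.4f}")
```

Because `D > n`, the feature matrix admits exact interpolation, and unregularized SGD drives the training error toward zero; the double-descent behavior discussed in the abstract concerns how the *test* error varies as `D` crosses `n`.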