In this technical report, we explore the behavior of Recursive Feature Machines (RFMs), a novel class of kernel machines that recursively learn features via the average gradient outer product, through a series of experiments on regression datasets. When random noise features are successively added to a dataset, we observe an intriguing pattern in the Mean Squared Error (MSE) curves: the test MSE exhibits a decrease-increase-decrease shape. This behavior is consistent across different dataset sizes, noise parameters, and target functions. Interestingly, the observed MSE curves resemble the "double descent" phenomenon seen in deep neural networks, hinting at a new connection between RFMs and neural network behavior. This report lays the groundwork for future research into this peculiar behavior.
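To make the setup concrete, the sketch below illustrates the kind of RFM-style loop and noise-feature experiment described above. It is a minimal illustration, not the authors' implementation: it assumes a Gaussian Mahalanobis kernel (the original RFM work uses a Laplace kernel), and all function names, parameters, and the toy target function are hypothetical choices made for this example.

```python
# Minimal RFM-style sketch, assuming a Gaussian Mahalanobis kernel for simplicity.
import numpy as np

def mahalanobis_kernel(X, Z, M, gamma=1.0):
    # K[i, j] = exp(-gamma * (x_i - z_j)^T M (x_i - z_j)), with M symmetric PSD
    XM, ZM = X @ M, Z @ M
    sq = np.sum(XM * X, axis=1)[:, None] + np.sum(ZM * Z, axis=1)[None, :] - 2 * XM @ Z.T
    return np.exp(-gamma * np.clip(sq, 0.0, None))

def rfm_fit(X, y, n_iters=5, gamma=1.0, reg=1e-3):
    n, d = X.shape
    M = np.eye(d)                                   # start with an isotropic metric
    for _ in range(n_iters):
        K = mahalanobis_kernel(X, X, M, gamma)
        alpha = np.linalg.solve(K + reg * np.eye(n), y)      # kernel ridge solve
        # Average Gradient Outer Product (AGOP) of the fitted predictor:
        # grad f(x) = -2*gamma * M * sum_j alpha_j K(x, x_j) (x - x_j)
        G = np.zeros((d, d))
        for i in range(n):
            diff = X[i] - X                                   # (n, d) differences
            grad = -2.0 * gamma * (M @ (diff * (alpha * K[i])[:, None]).sum(axis=0))
            G += np.outer(grad, grad)
        M = G / n
        M /= np.trace(M) + 1e-12                    # keep the scale of M bounded
    # Final kernel regression with the learned feature matrix M.
    K = mahalanobis_kernel(X, X, M, gamma)
    alpha = np.linalg.solve(K + reg * np.eye(n), y)
    return M, alpha

# Noise-feature experiment: append k random features and track the test MSE.
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
y = X[:, 0] * X[:, 1]                               # a simple low-dimensional target
for k in [0, 5, 20, 80]:
    Xk = np.hstack([X, rng.normal(size=(n, k))])    # add k pure-noise features
    Xtr, Xte, ytr, yte = Xk[:150], Xk[150:], y[:150], y[150:]
    M, alpha = rfm_fit(Xtr, ytr)
    preds = mahalanobis_kernel(Xte, Xtr, M) @ alpha
    print(f"noise features = {k:3d}, test MSE = {np.mean((preds - yte) ** 2):.4f}")
```

In this sketch the learned matrix M concentrates its mass on the informative coordinates, which is how an RFM can stay robust as pure-noise features are appended; sweeping the number of added noise features and plotting the resulting test MSE is the kind of curve the report studies.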