Thanks to its fine balance between model flexibility and interpretability, the nonparametric additive model has been widely used, and variable selection for this type of model has been studied extensively. However, none of the existing solutions can control the false discovery rate (FDR) unless the sample size tends to infinity. The knockoff framework is a recent proposal that can address this issue, but few knockoff solutions are directly applicable to nonparametric models. In this article, we propose a novel kernel knockoffs selection procedure for the nonparametric additive model. We integrate three key components: knockoffs, subsampling for stability, and random feature mapping for nonparametric function approximation. We show that the proposed method is guaranteed to control the FDR for any sample size, and achieves a power that approaches one as the sample size tends to infinity. We demonstrate the efficacy of our method through extensive simulations and comparisons with alternative solutions. Our proposal thus makes useful contributions to the methodology of nonparametric variable selection, FDR-based inference, and knockoffs.
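To make two of the named components concrete, the following is a minimal Python sketch, our own illustration rather than the paper's implementation: random Fourier features (Rahimi and Recht, 2007) as one standard instance of random feature mapping, and the knockoff+ data-dependent threshold (Barber and Candès, 2015), which underlies finite-sample FDR control. The function names `random_fourier_features` and `knockoff_threshold` are hypothetical.

```python
import numpy as np

def random_fourier_features(x, n_features=100, gamma=1.0, rng=None):
    """Approximate an RBF kernel k(x, x') = exp(-gamma * (x - x')^2) by a
    finite random feature map z(x), so that k(x, x') ~= z(x) @ z(x')."""
    rng = np.random.default_rng(rng)
    w = rng.normal(0.0, np.sqrt(2.0 * gamma), size=n_features)  # spectral frequencies
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)          # random phases
    return np.sqrt(2.0 / n_features) * np.cos(np.outer(x, w) + b)

def knockoff_threshold(W, q=0.2):
    """Knockoff+ threshold of Barber and Candes (2015): the smallest t with
    (1 + #{j: W_j <= -t}) / max(1, #{j: W_j >= t}) <= q; select {j: W_j >= t}."""
    for t in np.sort(np.abs(W[W != 0])):
        if (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t)) <= q:
            return t
    return np.inf  # no threshold achieves the target FDR level q
```

Roughly, and under our reading of the abstract: each covariate and its knockoff copy are expanded with such feature maps, a sparse additive fit over subsampled rows yields importance statistics W_j whose signs are symmetric for null variables, and the variables with W_j at or above the threshold are selected.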