In this study, we investigate the bias and variance properties of the debiased Lasso in linear regression when the tuning parameter of the node-wise Lasso is chosen to be smaller than in previous studies. We consider the case where the number of covariates $p$ is bounded by a constant multiple of the sample size $n$. First, we show that the bias of the debiased Lasso can be reduced, without the asymptotic variance diverging, by setting the order of the tuning parameter to $1/\sqrt{n}$. This implies that the debiased Lasso is asymptotically normal provided that the number of nonzero coefficients $s_0$ satisfies $s_0 = o(\sqrt{n/\log p})$, whereas previous studies require $s_0 = o(\sqrt{n}/\log p)$ when no sparsity assumption is imposed on the precision matrix. Second, we propose a data-driven tuning parameter selection procedure for the node-wise Lasso that is consistent with our theoretical results. Simulation studies show that our procedure yields confidence intervals with good coverage properties in various settings. We also present a real economic data example to demonstrate the efficacy of our selection procedure.
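To make the construction concrete, the following is a minimal Python sketch of the debiased Lasso in which the node-wise Lasso penalty is taken of order $1/\sqrt{n}$, as discussed above. The function name `debiased_lasso`, the default constants, and the plug-in variance estimator are illustrative assumptions; the paper's data-driven tuning parameter selection procedure is not reproduced here.

```python
# Minimal sketch: debiased Lasso with node-wise Lasso tuning of order 1/sqrt(n).
# Names and default constants are illustrative, not the paper's exact procedure.
import numpy as np
from sklearn.linear_model import Lasso


def debiased_lasso(X, y, lam=None, lam_node=None):
    n, p = X.shape

    # Initial Lasso fit with a penalty of the usual sqrt(log p / n) order.
    if lam is None:
        lam = np.sqrt(2.0 * np.log(p) / n)
    beta_hat = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_

    # Node-wise Lasso with a penalty of order 1/sqrt(n) (smaller than the
    # conventional sqrt(log p / n) order), used to build an approximate
    # inverse Theta of the Gram matrix X'X / n.
    if lam_node is None:
        lam_node = 1.0 / np.sqrt(n)
    Theta = np.zeros((p, p))
    for j in range(p):
        idx = np.arange(p) != j
        gamma_j = Lasso(alpha=lam_node, fit_intercept=False).fit(X[:, idx], X[:, j]).coef_
        resid_j = X[:, j] - X[:, idx] @ gamma_j
        tau2_j = resid_j @ resid_j / n + lam_node * np.abs(gamma_j).sum()
        row = np.zeros(p)
        row[j] = 1.0
        row[idx] = -gamma_j
        Theta[j] = row / tau2_j

    # One-step bias correction: b_hat = beta_hat + Theta X'(y - X beta_hat) / n.
    residual = y - X @ beta_hat
    b_hat = beta_hat + Theta @ X.T @ residual / n

    # Plug-in standard errors from the asymptotic variance Theta Sigma Theta' / n.
    Sigma_hat = X.T @ X / n
    df = max(n - np.count_nonzero(beta_hat), 1)
    sigma_hat = np.sqrt(residual @ residual / df)
    se = sigma_hat * np.sqrt(np.diag(Theta @ Sigma_hat @ Theta.T) / n)
    return b_hat, se
```

Under this sketch, a two-sided 95% confidence interval for the $j$-th coefficient is $\hat{b}_j \pm 1.96\,\widehat{\mathrm{se}}_j$.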