We propose a test of many zero parameter restrictions in a high-dimensional linear iid regression model with $k \gg n$ regressors. The test statistic is formed by estimating key parameters one at a time based on many low-dimensional regression models with nuisance terms. The parsimoniously parametrized models identify whether or not the original parameter of interest is zero. Estimating fixed low-dimensional sub-parameters ensures greater estimator accuracy, requires neither a sparsity assumption nor, therefore, a regularized estimator, and is computationally fast compared to, e.g., the de-biased Lasso; using only the largest in a sequence of weighted estimators reduces test statistic complexity and therefore estimation error. We provide a parametric wild bootstrap for p-value computation, and prove the test is consistent and has non-trivial $\sqrt{n/\{\ln(n)\mathcal{M}_{n}\}}$-local-to-null power, where $\mathcal{M}_{n}$ is the $l_{\infty}$ covariate fourth moment.
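To fix ideas, the following is a minimal sketch of how a max-type statistic built from one-coordinate-at-a-time low-dimensional fits, together with a multiplier (wild) bootstrap p-value, might be computed. It is an illustration under simplifying assumptions, not the authors' exact estimator: the function name `max_score_test`, the single-regressor fits, the self-normalizing weights, and the Gaussian multipliers are all assumptions made for this sketch.

```python
import numpy as np

def max_score_test(y, X, test_idx, n_boot=999, seed=0):
    """Hypothetical sketch (not the paper's exact procedure): test
    H0: beta_j = 0 for all j in `test_idx` using one low-dimensional fit per
    coordinate, a max-type statistic over weighted (self-normalized) scores,
    and a Gaussian-multiplier wild bootstrap for the p-value."""
    rng = np.random.default_rng(seed)
    n = y.shape[0]
    yc = y - y.mean()
    scores = []                                   # per-coordinate score vectors
    for j in test_idx:
        xj = X[:, j] - X[:, j].mean()             # single low-dimensional regressor
        s = xj * yc                               # score under H0: beta_j = 0
        scores.append(s / s.std(ddof=1))          # self-normalizing weight
    S = np.vstack(scores)                         # (m, n) array of scaled scores

    # Max-type test statistic: largest absolute standardized score sum
    T = np.max(np.abs(S.sum(axis=1))) / np.sqrt(n)

    # Wild bootstrap with standard normal multipliers: perturb each score
    # vector, recompute the max statistic, and compare with the observed T
    W = rng.standard_normal((n_boot, n))
    T_boot = np.max(np.abs(S @ W.T), axis=0) / np.sqrt(n)
    pval = (1 + np.sum(T_boot >= T)) / (n_boot + 1)
    return T, pval
```

In this simplified form, each coordinate contributes only a one-regressor fit (no nuisance terms), so the sketch conveys the one-at-a-time and max-over-weighted-estimators structure rather than the full low-dimensional models with nuisance parameters described above.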