Series or orthogonal basis regression is one of the most popular non-parametric regression techniques in practice, obtained by regressing the response on features generated by evaluating the basis functions at the observed covariate values. The most routinely used series estimator is based on ordinary least squares fitting, which is known to be minimax rate optimal in various settings, albeit under stringent restrictions on the basis functions and the distribution of the covariates. In this work, inspired by the recently developed Forster-Warmuth (FW) learner, we propose an alternative series regression estimator that attains the minimax estimation rate under strictly weaker conditions on the basis functions and the joint law of the covariates than those required by existing series estimators in the literature. Moreover, a key contribution of this work is a generalization of the FW-learner to a so-called counterfactual regression problem, in which the response variable of interest may not be directly observed (hence, the name ``counterfactual'') on all sampled units, and therefore needs to be inferred in order to identify and estimate the regression in view from the observed data. Although counterfactual regression is not an entirely new area of inquiry, we provide the first systematic study of this challenging problem from a unified pseudo-outcome perspective. In fact, we provide what appears to be the first generic and constructive approach for generating the pseudo-outcome (to substitute for the unobserved response) that leads to estimation of the counterfactual regression curve of interest with small bias, namely bias of second order. Several applications illustrate the resulting FW-learner, including many nonparametric regression problems in the missing data and causal inference literature, for which we establish high-level conditions for minimax rate optimality of the proposed FW-learner.
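As background for the series estimator discussed above, the following is a minimal sketch of classical OLS-based series regression: covariates are mapped through a truncated basis expansion and the response is regressed on the resulting features by ordinary least squares. The cosine basis and truncation level `K` here are hypothetical choices for illustration; this sketch depicts the standard baseline estimator referenced in the abstract, not the proposed FW-learner.

```python
import numpy as np

def cosine_basis(x, K):
    """Evaluate the first K cosine basis functions on [0, 1] at points x."""
    # phi_0(x) = 1, phi_j(x) = sqrt(2) * cos(pi * j * x) for j >= 1
    cols = [np.ones_like(x)] + [np.sqrt(2.0) * np.cos(np.pi * j * x) for j in range(1, K)]
    return np.column_stack(cols)

def ols_series_fit(x, y, K):
    """Classical series estimator: OLS of the response y on the basis features Phi(x)."""
    Phi = cosine_basis(x, K)
    beta, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # least-squares coefficients
    return beta

def ols_series_predict(beta, x_new):
    """Evaluate the fitted series regression at new covariate values."""
    return cosine_basis(x_new, len(beta)) @ beta

# Toy illustration on simulated data (assumed setup, for demonstration only)
rng = np.random.default_rng(0)
n, K = 500, 8
x = rng.uniform(size=n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)  # unknown regression plus noise
beta_hat = ols_series_fit(x, y, K)
print(ols_series_predict(beta_hat, np.linspace(0.0, 1.0, 5)))
```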