Many causal and structural parameters are linear functionals of an underlying regression. The Riesz representer is a key component in the asymptotic variance of a semiparametrically estimated linear functional. We propose an adversarial framework to estimate the Riesz representer using general function spaces. We prove a nonasymptotic mean square rate in terms of an abstract quantity called the critical radius, then specialize it to neural networks, random forests, and reproducing kernel Hilbert spaces as leading cases. Furthermore, we use critical radius theory -- in place of Donsker theory -- to prove asymptotic normality without sample splitting, uncovering a ``complexity-rate robustness'' condition. This condition has practical consequences: inference without sample splitting is possible in several machine learning settings, which may improve finite sample performance compared to sample splitting. Our estimators achieve nominal coverage in highly nonlinear simulations where previous methods break down, and they shed new light on the heterogeneous effects of matching grants.
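To fix ideas, here is a minimal sketch of the objects involved, with notation introduced purely for illustration (the paper's own criterion may differ in details such as regularization). Suppose the target is a linear functional $\theta_0 = \mathbb{E}[m(W;\gamma_0)]$ of a square-integrable regression $\gamma_0(X) = \mathbb{E}[Y \mid X]$. By the Riesz representation theorem, there exists $\alpha_0$ with $\mathbb{E}[m(W;f)] = \mathbb{E}[\alpha_0(X) f(X)]$ for all square-integrable $f$, which yields the exact population identity
\[
\max_{f}\ \mathbb{E}\big[\,2\,m(W;f) - 2\,\alpha(X) f(X) - f(X)^2\,\big] \;=\; \mathbb{E}\big[\,(\alpha(X) - \alpha_0(X))^2\,\big],
\]
since the inner objective equals $\mathbb{E}[\,2(\alpha_0 - \alpha)f - f^2\,]$ and is maximized at $f = \alpha_0 - \alpha$. An adversarial estimator in this spirit replaces population expectations with sample averages and restricts the learner $\alpha$ and the adversary $f$ to function classes $\mathcal{A}$ and $\mathcal{F}$ (e.g., neural networks, random forests, or an RKHS), possibly with added regularization:
\[
\hat{\alpha} \;\in\; \arg\min_{\alpha \in \mathcal{A}}\ \max_{f \in \mathcal{F}}\ \mathbb{E}_n\big[\,2\,m(W;f) - 2\,\alpha(X) f(X) - f(X)^2\,\big].
\]
As a concrete instance, for the average treatment effect with $X = (D, \tilde{X})$ and $m(W;\gamma) = \gamma(1,\tilde{X}) - \gamma(0,\tilde{X})$, the Riesz representer is the inverse propensity weight $\alpha_0(D,\tilde{X}) = D/\pi(\tilde{X}) - (1-D)/(1-\pi(\tilde{X}))$, where $\pi(\tilde{X}) = \mathbb{P}(D = 1 \mid \tilde{X})$.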