The conditional moment problem is a powerful formulation for describing structural causal parameters in terms of observables, a prominent example being instrumental variable regression. A standard approach reduces the problem to a finite set of marginal moment conditions and applies the optimally weighted generalized method of moments (OWGMM), but this requires that we know a finite set of identifying moments, can be inefficient even when the chosen moments are identifying, or can be theoretically efficient but practically unwieldy if we use a growing sieve of moment conditions. Motivated by a variational minimax reformulation of OWGMM, we define a very general class of estimators for the conditional moment problem, which we term the variational method of moments (VMM) and which naturally enables controlling infinitely many moments. We provide a detailed theoretical analysis of multiple VMM estimators, including ones based on kernel methods and neural nets, and provide conditions under which these are consistent, asymptotically normal, and semiparametrically efficient in the full conditional moment model. We additionally provide algorithms for valid statistical inference based on the same kind of variational reformulations, for both kernel- and neural-net-based varieties. Finally, we demonstrate the strong performance of our proposed estimation and inference algorithms in a detailed series of synthetic experiments.
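As an illustrative sketch (the notation below is assumed for exposition, not quoted from the paper), the conditional moment problem and the kind of minimax objective that the variational reformulation suggests can be written as:

```latex
% Conditional moment problem: find the structural parameter \theta_0 satisfying
%   E[\psi(Y; \theta_0) \mid Z] = 0 \quad \text{almost surely},
% where \psi is a known residual function; e.g., \psi(Y;\theta) = Y - g(X;\theta)
% with instrument Z in instrumental variable regression.
%
% A generic minimax estimator over a critic (test-function) class \mathcal{F},
% with an illustrative quadratic penalty standing in for the optimal weighting:
\hat\theta \;=\; \arg\min_{\theta}\; \sup_{f \in \mathcal{F}}\;
  \frac{1}{n}\sum_{i=1}^{n} f(Z_i)\,\psi(Y_i;\theta)
  \;-\; \frac{1}{4n}\sum_{i=1}^{n} \bigl(f(Z_i)\,\psi(Y_i;\theta)\bigr)^{2}.
```

Taking $\mathcal{F}$ to be a reproducing kernel Hilbert space or a neural network class yields the kernel- and neural-net-based varieties, respectively; a finite-dimensional $\mathcal{F}$ recovers a marginal-moment GMM-style estimator.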