This paper is about the feasibility and means of root-n consistently estimating linear, mean-square continuous functionals of a high dimensional, approximately sparse regression. Such objects include a wide variety of interesting parameters, such as regression coefficients, average derivatives, and the average treatment effect. We give lower bounds on the convergence rate of estimators of a regression slope and an average derivative and find that these bounds are substantially larger than in a low dimensional, semiparametric setting. We also give debiased machine learners that are root-n consistent under either a minimal approximate sparsity condition or rate double robustness. These estimators improve on existing estimators by being root-n consistent under more general conditions than previously known.