General nonlinear sieve learnings are classes of nonlinear sieves that can approximate nonlinear functions of high-dimensional variables much more flexibly than various linear sieves (or series). This paper considers general nonlinear sieve quasi-likelihood ratio (GN-QLR) based inference on expectation functionals of time series data, where the functionals of interest are based on some nonparametric function that satisfies conditional moment restrictions and is learned using multilayer neural networks. While the asymptotic normality of the estimated functionals depends on some unknown Riesz representer of the functional space, we show that the optimally weighted GN-QLR statistic is asymptotically Chi-square distributed, regardless of whether the expectation functional is regular (root-$n$ estimable) or not. This holds when the data are weakly dependent, satisfying a beta-mixing condition. We apply our method to off-policy evaluation in reinforcement learning, by formulating the Bellman equation into the conditional moment restriction framework, so that we can make inference about the state-specific value functional using the proposed GN-QLR method with time series data. In addition, the averaged partial means and averaged partial derivatives of nonparametric instrumental variables and quantile IV models are also presented as leading examples. Finally, a Monte Carlo study shows the finite sample performance of the procedure.
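For concreteness, a minimal sketch of the framework referenced above, with notation that is ours rather than taken from the paper: an unknown function $h_0$ is assumed to satisfy a conditional moment restriction, the object of inference is an expectation functional of $h_0$, and off-policy evaluation is cast in the same form by treating the Bellman residual as the moment function.
\[
  \mathbb{E}\bigl[\rho\bigl(Y, h_0(X)\bigr) \mid W\bigr] = 0,
  \qquad
  \phi(h_0) = \mathbb{E}\bigl[m(X, h_0)\bigr],
\]
where $\rho$ is a known residual function, $W$ is the conditioning (instrument) variable, and $\phi$ is the expectation functional of interest. For off-policy evaluation, assuming a target policy $\pi$ and discount factor $\gamma$, the Q-function $Q^{\pi}$ satisfies a Bellman equation that fits the same template:
\[
  \mathbb{E}\Bigl[\, R_t + \gamma \sum_{a'} \pi(a' \mid S_{t+1})\, Q^{\pi}(S_{t+1}, a')
      - Q^{\pi}(S_t, A_t) \,\Bigm|\, S_t, A_t \Bigr] = 0 ,
\]
so that a state-specific value functional such as $\phi(Q^{\pi}) = \sum_{a} \pi(a \mid s)\, Q^{\pi}(s, a)$ can be treated as a (possibly irregular) functional of the learned $h_0 = Q^{\pi}$.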