We show how the random coefficients logit demand (BLP) model can be formulated as an automatically differentiable moment function, incorporating the numerical safeguards proposed in the literature. This enables gradient-based frequentist and quasi-Bayesian estimation using the Continuously Updating Estimator (CUE). Drawing on the machine learning literature, we outline hitherto under-utilized best practices in both frequentist and Bayesian estimation techniques. Our Monte Carlo experiments compare the performance of the CUE, two-step GMM (2S-GMM), and Laplace-type estimation (LTE). Preliminary findings indicate that the CUE, whether estimated by LTE or by frequentist optimization, has lower bias but higher MAE than the traditional 2S-GMM approach. We also find that combining credible intervals from MCMC sampling for the non-linear parameters with frequentist analytical standard errors for the concentrated-out linear parameters provides empirical coverage closest to the nominal level. The accompanying admest Python package provides a platform for replication and extensibility.
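To make the central idea concrete, the following is a minimal sketch, not the admest implementation, of a CUE criterion written with JAX so that its gradient is available through automatic differentiation. A simple linear IV moment condition stands in for the BLP moment function, and all names (moments, cue_objective, y, X, Z, theta) are illustrative assumptions rather than package API.

```python
import jax
import jax.numpy as jnp


def moments(theta, y, X, Z):
    """Per-observation moment contributions g_i(theta), shape (n, k_z).

    A toy linear IV moment, z_i * (y_i - x_i' theta), used here in place of
    the BLP moment function for illustration only.
    """
    resid = y - X @ theta              # (n,)
    return Z * resid[:, None]          # (n, k_z)


def cue_objective(theta, y, X, Z):
    """CUE criterion n * gbar' W(theta) gbar, with the weighting matrix
    W(theta) recomputed from the (centered) moment covariance at every theta."""
    g = moments(theta, y, X, Z)
    n = g.shape[0]
    gbar = g.mean(axis=0)
    S = (g - gbar).T @ (g - gbar) / n                        # centered covariance
    W = jnp.linalg.inv(S + 1e-10 * jnp.eye(S.shape[0]))      # small ridge for stability
    return n * gbar @ W @ gbar


# Gradient of the CUE objective via automatic differentiation; usable by any
# gradient-based optimizer, or as the log quasi-posterior kernel in an LTE/MCMC sampler.
cue_grad = jax.grad(cue_objective)

# Example usage with simulated data.
key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
Z = jax.random.normal(k1, (500, 3))
X = Z[:, :2] + 0.1 * jax.random.normal(k2, (500, 2))
theta0 = jnp.array([1.0, -0.5])
y = X @ theta0 + jax.random.normal(k3, (500,))
print(cue_objective(theta0, y, X, Z))
print(cue_grad(theta0, y, X, Z))
```

The same pattern extends to the BLP setting by replacing moments with a moment function that inverts market shares for the mean utilities before interacting the structural residual with the instruments.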