We propose a new, two-step empirical Bayes-type approach for neural networks. We show, in the context of the nonparametric regression model, that the procedure provides (up to a logarithmic factor) optimal recovery of the underlying functional parameter of interest and yields Bayesian credible sets with frequentist coverage guarantees. The approach requires fitting the neural network only once, making it substantially faster than bootstrap-type approaches. We demonstrate the applicability of our method on synthetic data, observing good estimation properties and reliable uncertainty quantification.
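The abstract does not spell out the two steps, so the following is only a hypothetical, deliberately simplified sketch of the general "fit once, then calibrate uncertainty from the data" idea on synthetic nonparametric regression data (Y_i = f_0(x_i) + sigma * eps_i); it is not the paper's actual construction. All names and the plug-in interval recipe below are assumptions for illustration.

```python
# Hypothetical sketch, not the paper's method: fit a neural network once,
# then form approximate pointwise intervals from an empirical noise estimate.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 500
x = np.sort(rng.uniform(0, 1, n)).reshape(-1, 1)
f0 = np.sin(2 * np.pi * x).ravel()      # true regression function (synthetic)
sigma = 0.3
y = f0 + sigma * rng.normal(size=n)     # Y_i = f0(x_i) + sigma * eps_i

# Step 1: a single neural-network fit (no bootstrap refits needed).
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
net.fit(x, y)
f_hat = net.predict(x)

# Step 2: plug-in noise estimate from residuals, then crude pointwise
# ~95% intervals around the fitted curve.
sigma_hat = np.sqrt(np.mean((y - f_hat) ** 2))
lower, upper = f_hat - 1.96 * sigma_hat, f_hat + 1.96 * sigma_hat

# Empirical pointwise coverage of the true function by the intervals.
coverage = np.mean((f0 >= lower) & (f0 <= upper))
print(f"estimated noise sd: {sigma_hat:.3f}, coverage: {coverage:.3f}")
```

Because the network is trained only once, the cost is a single fit plus a residual pass, which is the speed advantage over bootstrap-type approaches that refit the network many times.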