We study a nonparametric approach to Bayesian computation via feature means, where the expectation of prior features is updated to yield expected kernel posterior features, based on regression from learned neural net or kernel features of the observations. All quantities involved in the Bayesian update are learned from observed data, making the method entirely model-free. The resulting algorithm is a novel instance of a kernel Bayes' rule (KBR) based on importance weighting, which yields superior numerical stability to the original KBR, where operator inversion is required. We prove convergence of the estimator via a novel consistency analysis of the importance weighting estimator in the infinity norm. We evaluate KBR on challenging synthetic benchmarks, including a filtering problem with a state-space model involving high-dimensional image observations. Importance weighted KBR yields uniformly better empirical performance than the original KBR, and competitive performance relative to other state-of-the-art approaches.
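To make the update concrete, the following is a minimal sketch of an importance-weighted, kernel-Bayes-style posterior update, written as a weighted kernel ridge regression in dual form. It assumes Gaussian kernels, known (normalized) prior and sampling densities, and a toy conjugate-Gaussian check; the function names, bandwidth, and regularization constant are illustrative assumptions, not the paper's reference implementation.

```python
# Minimal sketch of an importance-weighted kernel Bayes' rule update.
# Assumptions (not from the paper): Gaussian kernels, known densities,
# hand-picked bandwidth and regularization. Names are hypothetical.
import numpy as np

def gaussian_kernel(A, B, sigma):
    """Gram matrix with entries exp(-||a - b||^2 / (2 sigma^2))."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def iw_kbr_coefficients(X, Y, y_obs, prior_pdf, sampling_pdf,
                        sigma_y=1.0, lam=1e-3):
    """Coefficients alpha of the posterior feature mean sum_i alpha_i phi(x_i).

    Solves an importance-weighted kernel ridge regression from features of Y
    to features of X, avoiding the operator inversion of the original KBR.
    Weights w_i = pi(x_i) / q(x_i) correct the sampling distribution of the
    (x_i, y_i) pairs toward the prior pi.
    """
    n = len(X)
    w = prior_pdf(X) / sampling_pdf(X)          # importance weights
    D = np.diag(w)
    K = gaussian_kernel(Y, Y, sigma_y)          # Gram matrix on observations
    k_obs = gaussian_kernel(Y, y_obs[None, :], sigma_y).ravel()
    # Weighted ridge solution in dual form: alpha = D (K D + n lam I)^{-1} k(y)
    alpha = D @ np.linalg.solve(K @ D + n * lam * np.eye(n), k_obs)
    return alpha

# Toy check on a conjugate Gaussian model: x ~ N(0, 1), y | x ~ N(x, 0.5^2),
# with (x, y) pairs drawn under a broader proposal q = N(0, 2^2).
rng = np.random.default_rng(0)
n = 2000
Xs = rng.normal(0.0, 2.0, size=(n, 1))                       # draws from q
Ys = Xs + 0.5 * rng.normal(size=(n, 1))                      # y | x
prior_pdf = lambda X: np.exp(-X[:, 0]**2 / 2) / np.sqrt(2 * np.pi)
samp_pdf = lambda X: np.exp(-X[:, 0]**2 / 8) / np.sqrt(8 * np.pi)
y_obs = np.array([1.0])
alpha = iw_kbr_coefficients(Xs, Ys, y_obs, prior_pdf, samp_pdf)
# Normalizing the coefficients is a common stabilization heuristic.
post_mean = (alpha * Xs[:, 0]).sum() / alpha.sum()           # E[x | y]
print(post_mean)  # analytic posterior mean is y / (1 + 0.25) = 0.8
```

Any posterior expectation of a function f then follows from the same coefficients as sum_i alpha_i f(x_i); in practice the bandwidth sigma_y is often set by the median heuristic rather than by hand.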