Over the years, sequential Monte Carlo (SMC) and, equivalently, particle filter (PF) theory has gained substantial attention from researchers. However, the resampling methodology, also known as offspring selection, has seen little recent progress. We propose two deterministic offspring selection methods that strive to minimize the Kullback-Leibler (KL) divergence and the total variation (TV) distance, respectively, between the particle distributions before and after offspring selection. By reducing the statistical distance between the selected offspring and the joint distribution, we obtain a heuristic search procedure that outperforms a maximum likelihood search precisely in those settings where the latter performs better than SMC. For SMC and particle Markov chain Monte Carlo (pMCMC), our proposed offspring selection methods consistently outperform or compare favorably with two state-of-the-art resampling schemes on two benchmark models from the literature.
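As a concrete illustration of the second objective, the sketch below allocates integer offspring counts so as to minimize the TV distance between the normalized particle weights and the empirical distribution of the offspring counts. It uses a largest-remainder rounding of the expected counts N*w_i, which is one deterministic allocation attaining that minimum; the function name `tv_minimizing_offspring` and the specific rounding rule are illustrative assumptions and not necessarily the algorithm proposed here.

```python
# Hedged sketch: deterministic offspring counts that minimize the TV distance
# sum_i |n_i/N - w_i| between the normalized weights w and the empirical
# distribution n/N of the offspring counts. This is a largest-remainder
# rounding of N*w, shown only to illustrate the kind of optimization the
# abstract describes; it is not claimed to be the paper's exact method.
import numpy as np

def tv_minimizing_offspring(weights: np.ndarray, N: int) -> np.ndarray:
    """Return integer offspring counts n with sum(n) == N closest to N*weights in TV."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalize the particle weights
    target = N * w                       # ideal (fractional) offspring counts
    n = np.floor(target).astype(int)     # start from the integer parts
    remainder = N - n.sum()              # offspring still to be assigned
    # assign the remaining offspring to the particles with the largest fractional parts
    frac = target - n
    top = np.argsort(-frac)[:remainder]
    n[top] += 1
    return n

# Example: 5 particles, 10 offspring
w = np.array([0.07, 0.13, 0.21, 0.26, 0.33])
print(tv_minimizing_offspring(w, 10))    # -> [1 1 2 3 3]
```

Note that this integer-part-plus-largest-fraction rule coincides with the deterministic first step of residual resampling, except that the leftover offspring are assigned to the largest fractional weights rather than drawn at random.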