Bayesian nonparametric mixture models offer a rich framework for model-based clustering. We consider the situation where the kernel of the mixture is available only up to an intractable normalizing constant. In this case, most of the commonly used Markov chain Monte Carlo (MCMC) methods are not suitable. We propose an approximate Bayesian computation (ABC) strategy, whereby we approximate the posterior to avoid the intractability of the kernel. We derive an ABC-MCMC algorithm which combines (i) the use of the predictive distribution induced by the nonparametric prior as a proposal and (ii) the use of the Wasserstein distance and its connection to optimal matching problems. To overcome the sensitivity to the parameters of our algorithm, we further propose an adaptive strategy. We illustrate the use of the proposed algorithm with several simulation studies and an application to real data, where we cluster a population of networks, comparing its performance with standard MCMC algorithms and validating the adaptive strategy.
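To make the two ingredients of the abstract concrete, the following is a minimal sketch (not the authors' exact algorithm) of an ABC-MCMC step that uses an independence-type proposal from a predictive distribution and accepts a move when the 1-Wasserstein distance between simulated and observed data falls below a tolerance. The callables `sample_from_predictive` and `simulate_data` are hypothetical placeholders for the prior-predictive proposal and the forward simulator of the intractable kernel; the 1D distance exploits the fact that in one dimension the optimal matching is obtained by sorting.

```python
import numpy as np


def wasserstein_1d(x, y):
    """Empirical 1-Wasserstein distance between two equal-size 1D samples.

    In one dimension the optimal matching pairs the order statistics, so the
    distance reduces to the mean absolute difference of the sorted samples.
    """
    x_sorted = np.sort(np.asarray(x, dtype=float))
    y_sorted = np.sort(np.asarray(y, dtype=float))
    return np.mean(np.abs(x_sorted - y_sorted))


def abc_mcmc(y_obs, sample_from_predictive, simulate_data,
             n_iter=1000, eps=0.5, seed=None):
    """Illustrative ABC-MCMC loop with a predictive (independence) proposal.

    `sample_from_predictive(rng)` draws a candidate latent configuration and
    `simulate_data(state, rng)` generates pseudo-data from the intractable
    kernel; both are assumed, user-supplied functions. With a proposal equal
    to the prior predictive, the Metropolis-Hastings ratio simplifies and a
    move is accepted whenever the pseudo-data lie within tolerance `eps`.
    """
    rng = np.random.default_rng(seed)
    state = sample_from_predictive(rng)              # initial configuration
    trace = []
    for _ in range(n_iter):
        candidate = sample_from_predictive(rng)      # proposal draw
        y_sim = simulate_data(candidate, rng)        # forward simulation only
        if wasserstein_1d(y_sim, y_obs) < eps:       # ABC acceptance rule
            state = candidate
        trace.append(state)
    return trace
```

Keeping the tolerance `eps` fixed is the simplest choice; the adaptive strategy mentioned in the abstract would instead tune such algorithmic parameters along the run.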