Several researchers have proposed minimisation of maximum mean discrepancy (MMD) as a method to quantise probability measures, i.e., to approximate a target distribution by a representative point set. We consider sequential algorithms that greedily minimise MMD over a discrete candidate set. We propose a novel non-myopic algorithm and, in order to both improve statistical efficiency and reduce computational cost, we investigate a variant that applies this technique to a mini-batch of the candidate set at each iteration. When the candidate points are sampled from the target, the consistency of these new algorithms, and their mini-batch variants, is established. We demonstrate the algorithms on a range of important computational problems, including optimisation of nodes in Bayesian cubature and the thinning of Markov chain output.
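To make the setting concrete, the sketch below illustrates basic greedy (myopic) MMD minimisation over a discrete candidate set, with an optional mini-batch of candidates scored at each iteration. It is an illustrative sketch only, not the non-myopic algorithm proposed here: the function names (`greedy_mmd_quantise`, `gaussian_kernel`), the choice of Gaussian kernel, and the use of the candidate empirical measure as a proxy for the target are assumptions made for the example.

```python
import numpy as np

def gaussian_kernel(x, y, lengthscale=1.0):
    """Gaussian (RBF) kernel matrix between the rows of x and the rows of y."""
    sq_dists = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * sq_dists / lengthscale**2)

def greedy_mmd_quantise(candidates, n_points, batch_size=None, lengthscale=1.0, rng=None):
    """Greedily select n_points from `candidates` so that the empirical measure
    on the selected points has small MMD to the empirical measure on the full
    candidate set (used here as a proxy for the target distribution).

    If `batch_size` is given, each iteration scores only a random mini-batch
    of the candidates instead of the whole set.
    """
    rng = np.random.default_rng(rng)
    N = candidates.shape[0]

    K = gaussian_kernel(candidates, candidates, lengthscale)
    kernel_mean = K.mean(axis=1)      # (N,) estimate of the kernel mean embedding at each candidate
    diag = np.diag(K)                 # k(x_i, x_i)

    selected = []
    sum_k_selected = np.zeros(N)      # running sum over selected points: sum_j k(x_i, z_j)

    for n in range(n_points):
        if batch_size is None:
            batch = np.arange(N)
        else:
            batch = rng.choice(N, size=min(batch_size, N), replace=False)

        # Myopic greedy criterion: the candidate minimising
        #   k(x, x) + 2 * sum_j k(x, z_j) - 2 (n + 1) * mean_i k(x, x_i)
        # minimises the MMD between the augmented point set and the candidate measure.
        crit = (diag[batch]
                + 2.0 * sum_k_selected[batch]
                - 2.0 * (n + 1) * kernel_mean[batch])
        idx = batch[np.argmin(crit)]

        selected.append(idx)
        sum_k_selected += K[:, idx]

    return candidates[selected], np.array(selected)

# Example usage: thin 1000 Gaussian samples down to 20 representative points.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.normal(size=(1000, 2))
    points, idx = greedy_mmd_quantise(samples, n_points=20, batch_size=200, rng=rng)
    print(points.shape)   # (20, 2)
```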