The nonparametric view of Bayesian inference has transformed statistics and many of its applications. The canonical Dirichlet process and other, more general families of nonparametric priors have served as a gateway to solving frontier uncertainty quantification problems of large, or even infinite, dimensional nature. This success is largely due to the available constructions and representations of such distributions, which in turn have led to a variety of sampling schemes. Undoubtedly, the two most useful constructions are the one based on the normalization of homogeneous completely random measures and the one based on stick-breaking processes, together with their various particular cases. Understanding their distributional features, and how different random probability measures compare with one another, is a key ingredient for their proper application. In this paper, we explore the prior discrepancy, through a divergence-based analysis, of extreme classes of stick-breaking processes. Specifically, we investigate the random Kullback-Leibler divergences between the Dirichlet process and the geometric process, as well as some of their moments. Furthermore, we also perform the analysis within the general exchangeable stick-breaking class of nonparametric priors, leading to appealing results.
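To make the contrast between the two extreme stick-breaking classes concrete, the following is a minimal sketch (not taken from the paper) of how truncated weights can be simulated for both priors: the Dirichlet process uses independent Beta(1, θ) stick-breaking proportions, while the geometric process reuses a single Beta-distributed proportion, so its weights decay geometrically. The parameter values and function names are illustrative assumptions.

```python
import numpy as np

def dirichlet_process_weights(theta, K, rng):
    """Truncated Dirichlet process weights via stick-breaking:
    w_k = v_k * prod_{j<k} (1 - v_j), with v_j ~ Beta(1, theta) i.i.d."""
    v = rng.beta(1.0, theta, size=K)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

def geometric_process_weights(a, b, K, rng):
    """Truncated geometric process weights: a single v ~ Beta(a, b)
    shared across sticks gives w_k = v * (1 - v)^(k-1)."""
    v = rng.beta(a, b)
    return v * (1.0 - v) ** np.arange(K)

rng = np.random.default_rng(0)
w_dp = dirichlet_process_weights(theta=2.0, K=1000, rng=rng)   # theta is illustrative
w_geo = geometric_process_weights(a=1.0, b=2.0, K=1000, rng=rng)
```

Note the design difference this exposes: the geometric weights are almost surely decreasing in k, whereas the Dirichlet process weights are only stochastically ordered, which is one reason divergence-based comparisons between the two classes are informative.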