Under unidentified nonparametric models, the multi-modal posterior causes poor mixing of Markov chain Monte Carlo (MCMC), which is a stumbling block to Bayesian prediction. In this article, we conceptualize a prior informativeness threshold that is essentially the variance of the posterior modes, expressed through the uncertainty hyperparameters of the nonparametric priors. The threshold serves as a lower bound on the within-chain MCMC variance required to ensure MCMC mixing, and drives prior modification, via hyperparameter tuning, to reduce the mode variance. Our method is distinguished from existing postprocessing methods in that it directly samples well-mixed MCMC chains on the unconstrained space and retains the original posterior predictive distribution for predictive inference. Our method succeeds in Bayesian survival prediction under an unidentified nonparametric transformation model, supported by inferential theory for the posterior variance, with the elicitation of two delicate nonparametric priors. Comprehensive simulations and a real-world data analysis demonstrate that our method achieves MCMC mixing and outperforms existing approaches in survival prediction.
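As a schematic restatement of the threshold idea (the notation below is introduced here for illustration only and is not taken from the paper), let $\sigma^2_{\mathrm{mode}}(\boldsymbol{\eta})$ denote the variance of the posterior modes as a function of the prior uncertainty hyperparameters $\boldsymbol{\eta}$, and let $\sigma^2_{\mathrm{within}}$ denote the within-chain MCMC variance. The mixing condition can then be sketched as
$$\sigma^2_{\mathrm{within}} \;\ge\; \tau(\boldsymbol{\eta}) := \sigma^2_{\mathrm{mode}}(\boldsymbol{\eta}),$$
and, when the inequality fails, the hyperparameters $\boldsymbol{\eta}$ are tuned to shrink $\sigma^2_{\mathrm{mode}}(\boldsymbol{\eta})$ until it holds.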