Inference in modern Bayesian Neural Networks (BNNs) often relies on a variational treatment, which imposes assumptions on the independence structure and functional form of the posterior that are frequently violated. Traditional MCMC approaches avoid these assumptions at the cost of increased computation, owing to their incompatibility with subsampling of the likelihood. Newer Piecewise Deterministic Markov Process (PDMP) samplers do permit subsampling, though they introduce model-specific inhomogeneous Poisson processes (IPPs) which are difficult to sample from. This work introduces a new generic and adaptive thinning scheme for sampling from these IPPs, and demonstrates how this approach can accelerate the application of PDMPs to inference in BNNs. Experiments illustrate that inference with these methods is computationally feasible, can improve predictive accuracy and MCMC mixing performance, and provides informative uncertainty measurements when compared against other approximate inference schemes.
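For context, the thinning idea the abstract refers to can be illustrated with the classical Lewis-Shedler scheme: propose candidate event times from a homogeneous Poisson process whose constant rate upper-bounds the true rate, then accept each candidate with probability equal to the ratio of the true rate to the bound. The sketch below is a minimal illustration under that standard construction, not the paper's adaptive scheme; the function names, the example rate, and the bound are assumptions for illustration only.

```python
import numpy as np

def sample_ipp_thinning(rate, rate_bound, t_max, rng=None):
    """Sample event times from an inhomogeneous Poisson process on [0, t_max]
    via Lewis-Shedler thinning (illustrative sketch, not the paper's scheme).

    rate:       callable t -> lambda(t), the nonnegative event rate
    rate_bound: constant M with lambda(t) <= M for all t in [0, t_max]
    """
    rng = np.random.default_rng() if rng is None else rng
    events, t = [], 0.0
    while True:
        # Propose the next candidate from a homogeneous process with rate M.
        t += rng.exponential(1.0 / rate_bound)
        if t > t_max:
            break
        # Thinning step: accept the candidate with probability lambda(t) / M.
        if rng.uniform() < rate(t) / rate_bound:
            events.append(t)
    return np.array(events)

# Hypothetical example: a piecewise-linear rate of the kind that arises in
# PDMP samplers, lambda(t) = max(0, a + b t), with an assumed bound M = 9.
events = sample_ipp_thinning(lambda t: max(0.0, -1.0 + 2.0 * t),
                             rate_bound=9.0, t_max=5.0)
```

The efficiency of this construction hinges on how tight the bound M is: a loose bound wastes proposals, which is what motivates the adaptive bounds the abstract describes.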