The problem of computing posterior functionals in general high-dimensional statistical models with possibly non-log-concave likelihood functions is considered. Based on the proof strategy of [49], but using only local likelihood conditions and without relying on M-estimation theory, nonasymptotic statistical and computational guarantees are provided for a gradient-based MCMC algorithm. Given a suitable initialiser, these guarantees scale polynomially in key algorithmic quantities. The abstract results are applied to several concrete statistical models, including density estimation, nonparametric regression with generalised linear models, and a canonical statistical non-linear inverse problem arising from PDEs.
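The abstract refers to a gradient-based MCMC algorithm without naming a specific scheme. A canonical member of this class is the unadjusted Langevin algorithm, and the following minimal sketch is offered only as an illustration of that generic idea, not as the paper's method; the names `grad_log_post`, `x0`, `step`, and `n_iter` are hypothetical placeholders, and the step size and initialiser would in practice have to satisfy conditions of the kind the guarantees require.

```python
import numpy as np

def ula_sample(grad_log_post, x0, step=1e-3, n_iter=10_000, rng=None):
    """Generic gradient-based MCMC sketch (unadjusted Langevin algorithm).

    grad_log_post : callable returning the gradient of the log-posterior at x
    x0            : initialiser (the abstract emphasises a 'suitable initialiser')
    step          : step size; hypothetical value, typically small relative to curvature
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_iter, x.size))
    for k in range(n_iter):
        noise = rng.standard_normal(x.size)
        # Langevin update: gradient step on the log-posterior plus Gaussian noise
        x = x + step * grad_log_post(x) + np.sqrt(2.0 * step) * noise
        samples[k] = x
    return samples
```

A posterior functional would then be approximated by averaging the functional over the returned samples, e.g. `np.mean([f(s) for s in samples])` for a quantity of interest `f`.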