Robust Bayesian inference using the density power divergence (DPD) has emerged as a promising approach for handling outliers in statistical estimation. While the DPD-based posterior offers theoretical guarantees of robustness, its practical implementation faces significant computational challenges, particularly for general parametric models whose DPD loss involves an intractable integral term. These challenges become especially pronounced in high-dimensional settings, where traditional numerical integration methods are inadequate and computationally expensive. We propose a novel sampling methodology that addresses these limitations by integrating the loss-likelihood bootstrap with a stochastic gradient descent algorithm specifically designed for DPD-based estimation. Our approach enables efficient and scalable sampling from DPD-based posteriors for a broad class of parametric models, including those with intractable integrals, and we further extend it to accommodate generalized linear models. Through comprehensive simulation studies, we demonstrate that our method efficiently samples from DPD-based posteriors and offers superior computational scalability compared to conventional methods, particularly in high-dimensional settings. The results also highlight its ability to handle complex parametric models with intractable integral terms.
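
To make the general idea concrete, the following is a minimal sketch (not the authors' implementation) of sampling from a DPD-based posterior by combining the loss-likelihood bootstrap with stochastic gradient descent. It assumes, for illustration only, a univariate Gaussian model (chosen so the result is easy to check, with the integral term still estimated by Monte Carlo as it would be for an intractable model), Dirichlet bootstrap weights, reparameterized Monte Carlo draws with common random numbers, and finite-difference gradients; names such as dpd_loss and sample_dpd_posterior are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

def log_density(x, mu, log_sigma):
    """Gaussian log density, parameterized by (mu, log sigma) for unconstrained SGD."""
    return -0.5 * np.log(2 * np.pi) - log_sigma - 0.5 * ((x - mu) / np.exp(log_sigma)) ** 2

def dpd_loss(params, x, w, alpha, z):
    """Weighted DPD objective: Monte Carlo integral term plus the weighted empirical term."""
    mu, log_sigma = params
    # Empirical term: -(1 + alpha)/alpha * sum_i w_i f_theta(x_i)^alpha.
    emp = -(1 + alpha) / alpha * np.sum(w * np.exp(alpha * log_density(x, mu, log_sigma)))
    # Integral term int f_theta(y)^(1+alpha) dy = E_{f_theta}[f_theta(Y)^alpha],
    # estimated with reparameterized draws y = mu + sigma * z (z fixed per gradient call).
    y = mu + np.exp(log_sigma) * z
    integral = np.mean(np.exp(alpha * log_density(y, mu, log_sigma)))
    return integral + emp

def finite_diff_grad(f, params, eps=1e-5):
    """Central finite-difference gradient (stands in for autodiff to keep the sketch short)."""
    grad = np.zeros_like(params)
    for j in range(params.size):
        step = np.zeros_like(params)
        step[j] = eps
        grad[j] = (f(params + step) - f(params - step)) / (2 * eps)
    return grad

def sample_dpd_posterior(x, alpha=0.5, n_draws=100, n_steps=500, lr=0.05, n_mc=200):
    """One DPD-posterior draw per weighted SGD run: draw Dirichlet weights, then minimize."""
    draws = []
    for _ in range(n_draws):
        w = rng.dirichlet(np.ones(x.size))                   # loss-likelihood bootstrap weights
        params = np.array([np.median(x), np.log(x.std())])   # rough robust initialization
        for _ in range(n_steps):
            z = rng.standard_normal(n_mc)                    # fresh MC noise -> stochastic gradient
            grad = finite_diff_grad(lambda p: dpd_loss(p, x, w, alpha, z), params)
            params = params - lr * grad
        draws.append([params[0], np.exp(params[1])])
    return np.array(draws)

# Toy data: 95% N(0, 1) observations contaminated by 5% outliers near 8.
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])
draws = sample_dpd_posterior(x)
print("posterior mean of (mu, sigma):", draws.mean(axis=0))  # should sit near (0, 1)

Each Dirichlet-weighted SGD run yields one approximate posterior draw. The Monte Carlo estimate of the integral term is what makes the gradient stochastic, while reusing the same draws z within each gradient evaluation keeps the finite differences numerically stable.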