Bayesian inference with empirical likelihood faces a challenge: the convex hull constraint restricts the posterior domain to a proper subset of the original parameter space. We propose a regularized exponentially tilted empirical likelihood to address this issue. Our method removes the convex hull constraint using a novel regularization technique, incorporating a continuous exponential family distribution to satisfy a Kullback-Leibler divergence criterion. The regularization arises as a limiting procedure in which pseudo-data are added to the formulation of exponentially tilted empirical likelihood in a structured fashion. We show that this regularized exponentially tilted empirical likelihood retains certain desirable asymptotic properties of (exponentially tilted) empirical likelihood and has improved finite sample performance. Simulation and data analysis demonstrate that the proposed method provides a suitable pseudo-likelihood for Bayesian inference. The implementation of our method is available as the R package retel. Supplementary materials for this article are available online.
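To make the convex hull constraint concrete, the following sketch (not the authors' regularized method, and all names hypothetical) computes the log empirical likelihood ratio for a scalar mean via the standard dual problem. When the hypothesized mean lies outside the open convex hull of the data, no feasible probability weights exist and the log likelihood ratio is minus infinity; this is the boundary behavior that the proposed regularization removes.

```python
# Sketch only: standard empirical likelihood (EL) for a scalar mean,
# illustrating the convex hull constraint. Not the paper's RETEL method.
import numpy as np

def el_log_ratio(x, mu, tol=1e-10):
    """Log EL ratio for the mean mu, via the dual problem.

    Returns -inf when mu lies outside the open convex hull
    (min(x), max(x)) of the data, where no feasible weights exist.
    """
    x = np.asarray(x, dtype=float)
    z = x - mu
    if not (z.min() < 0.0 < z.max()):  # mu outside the convex hull
        return -np.inf
    # Solve sum_i z_i / (1 + lam * z_i) = 0 for lam by bisection.
    # g(lam) is strictly decreasing on (-1/max(z), -1/min(z)).
    lo = -1.0 / z.max() + tol
    hi = -1.0 / z.min() - tol
    def g(lam):
        return np.sum(z / (1.0 + lam * z))
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    # log EL ratio = -sum_i log(1 + lam * z_i); equals 0 at the sample mean.
    return -np.sum(np.log1p(lam * z))

x = [1.2, 0.4, 2.1, 0.9, 1.5]
print(el_log_ratio(x, 1.2))  # finite: mu inside the hull
print(el_log_ratio(x, 5.0))  # -inf: mu outside the hull
```

In a Bayesian setting, the `-inf` branch is exactly why the posterior is confined to a proper subset of the parameter space: any prior mass outside the convex hull receives zero likelihood.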