We prove the large deviation principle (LDP) for posterior distributions arising from curved exponential families in a parametric setting, allowing for misspecification of the model. Moreover, motivated by the so-called inverse Sanov theorem, obtained in a nonparametric setting by Ganesh and O'Connell (1999 and 2000), we study the relationship between the rate function for the LDP studied in this paper and the rate function for the LDP for the corresponding maximum likelihood estimators. In our setting, even in the non-misspecified case, it is not true in general that the rate functions for posterior distributions and for maximum likelihood estimators are Kullback-Leibler divergences with exchanged arguments. Finally, the results of the paper have some further interest for the case of exponential families admitting a dual one (see Letac (2021+)).