Including prior information about model parameters is a fundamental step of any Bayesian statistical analysis. It is viewed positively by some because it allows, among other things, the quantitative incorporation of expert opinion about model parameters. It is viewed negatively by others because it sets the stage for subjectivity in statistical analysis. It certainly creates problems when inference is skewed by a conflict between the prior and the collected data. According to the theory of conflict resolution (O'Hagan and Pericchi, 2012), a solution to such problems is to diminish the impact of conflicting prior information, yielding inference consistent with the data. This is typically achieved by using heavy-tailed priors. We study, both theoretically and numerically, the efficacy of such a solution in regression, where the prior information about the coefficients takes the form of a product of density functions with known location and scale parameters. We study functions with regularly varying tails (Student distributions), functions with log-regularly varying tails (as introduced in Desgagné (2015)), and propose functions with slower tail decay that allow the resolution of any conflict that can arise under that regression framework, in contrast to the two previous types of functions. The code to reproduce all numerical experiments is available online.
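As a minimal illustration of the conflict-resolution mechanism described above (a sketch under simplified assumptions, not the paper's actual experiments), the following Python snippet contrasts a light-tailed normal prior with a heavy-tailed Student (Cauchy) prior, which has regularly varying tails, for a single observation y ~ N(theta, 1). The posterior mean is computed by simple grid integration; as the conflict between the prior location and the data grows, the heavy-tailed prior's influence wanes and the posterior mean tracks the data, whereas the normal prior keeps pulling the estimate toward the prior location.

```python
import numpy as np
from scipy import stats

def posterior_mean(y, prior_logpdf, grid):
    """Posterior mean of theta for y ~ N(theta, 1), by numerical integration on a grid."""
    logpost = stats.norm.logpdf(y, loc=grid, scale=1.0) + prior_logpdf(grid)
    weights = np.exp(logpost - logpost.max())          # normalize for numerical stability
    return np.sum(grid * weights) / np.sum(weights)

grid = np.linspace(-50.0, 50.0, 200001)
normal_prior = lambda t: stats.norm.logpdf(t, loc=0.0, scale=1.0)          # light tails
student_prior = lambda t: stats.t.logpdf(t, df=1, loc=0.0, scale=1.0)      # Cauchy: regularly varying tails

# Increasing conflict between the prior location (0) and the observation y
for y in [0.0, 3.0, 10.0, 20.0]:
    print(f"y = {y:5.1f}  normal-prior post. mean = {posterior_mean(y, normal_prior, grid):7.3f}"
          f"  Student-prior post. mean = {posterior_mean(y, student_prior, grid):7.3f}")
```

Under the normal prior the posterior mean remains a fixed weighted average (here y/2, drifting ever further from the data as y grows), while under the Student prior it approaches y, i.e., the conflicting prior information is effectively discarded.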