The prediction of the variance-covariance matrix of the multivariate normal distribution is important in multivariate analysis. We investigated Bayesian predictive distributions for Wishart distributions under the Kullback-Leibler divergence. The conditional reducibility of the family of Wishart distributions enables us to decompose the risk of a Bayesian predictive distribution. We considered a recently introduced class of prior distributions, called the family of enriched standard conjugate prior distributions, and compared the Bayesian predictive distributions based on these priors. Furthermore, we studied the performance of the Bayesian predictive distribution based on the reference prior distribution in the family and showed that there exists a prior distribution in the family that dominates the reference prior distribution. Our study provides new insight into multivariate analysis when there exists an ordered inferential importance among the independent variables.
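As a brief illustration of the risk criterion referred to above (a sketch in standard notation, assumed here rather than taken from the paper: $p(y \mid \Sigma)$ denotes the sampling density, $x$ the observed data, and $\pi$ a prior on $\Sigma$), the Kullback-Leibler risk of a predictive density $\hat{p}(\cdot \mid x)$ and the corresponding Bayesian predictive density are
\[
R(\Sigma, \hat{p}) = \mathrm{E}_{x}\!\left[\int p(y \mid \Sigma)\,\log\frac{p(y \mid \Sigma)}{\hat{p}(y \mid x)}\,dy\right],
\qquad
\hat{p}_{\pi}(y \mid x) = \int p(y \mid \Sigma)\,\pi(\Sigma \mid x)\,d\Sigma,
\]
where $\hat{p}_{\pi}$ minimizes the Bayes risk $\int R(\Sigma, \hat{p})\,\pi(\Sigma)\,d\Sigma$. In this sense, a prior $\pi_{1}$ dominates a prior $\pi_{2}$ when $R(\Sigma, \hat{p}_{\pi_{1}}) \le R(\Sigma, \hat{p}_{\pi_{2}})$ for all $\Sigma$, with strict inequality for some $\Sigma$.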