This article extends the framework of Bayesian inference and provides direct probabilistic methods for approaching inference tasks that are typically handled with information theory. We treat Bayesian probability updating as a random process and uncover intrinsic quantitative features of joint probability distributions called inferential moments. Inferential moments quantify shape information about how a prior distribution is expected to update in response to yet-to-be-obtained information. Further, we characterize the unique probability distribution whose statistical moments are the inferential moments in question. We find a power-series expansion of the mutual information in terms of inferential moments, which implies a connection between inferential-theoretic logic and elements of information theory. Of particular interest is the inferential deviation, the expected variation of the probability of one variable in response to an inferential update of another. We explore two applications that analyze the inferential deviations of a Bayesian network to improve decision-making. We implement simple greedy sensor-tasking algorithms based on inferential deviations that generally outperform analogous greedy mutual-information algorithms in terms of the root-mean-squared error between epistemic probability estimates and the ground-truth probabilities they estimate.
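To make the central quantity concrete, here is a minimal sketch, assuming the inferential deviation of a target variable X with respect to an observable Y is the square root of the Y-expected squared posterior shift, sqrt(Σ_y p(y)(p(x|y) − p(x))²), consistent with the description above. The two-sensor setup, the sensor likelihoods, and the sum-over-states scoring are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def posterior_table(p_x, p_y_given_x):
    """Bayes' rule: p(y) and p(x|y) from a prior and a sensor likelihood.

    p_x:          (|X|,)      prior over the target variable
    p_y_given_x:  (|X|, |Y|)  sensor likelihood
    Returns p_y with shape (|Y|,) and p_x_given_y with shape (|Y|, |X|).
    """
    p_xy = p_x[:, None] * p_y_given_x        # joint p(x, y)
    p_y = p_xy.sum(axis=0)                   # marginal over sensor readings
    p_x_given_y = (p_xy / p_y).T             # posterior for each possible reading
    return p_y, p_x_given_y

def inferential_deviation(p_x, p_y_given_x):
    """sqrt( E_y[ (p(x|y) - p(x))^2 ] ), one value per state of X.

    Assumed form: root of the y-expected squared posterior shift, matching
    'expected variation of the probability of one variable in response to
    an inferential update of another'.
    """
    p_y, p_x_given_y = posterior_table(p_x, p_y_given_x)
    return np.sqrt(p_y @ (p_x_given_y - p_x) ** 2)

def mutual_information(p_x, p_y_given_x):
    """I(X;Y) in nats, the score used by a greedy mutual-information baseline."""
    p_y, p_x_given_y = posterior_table(p_x, p_y_given_x)
    p_yx = p_y[:, None] * p_x_given_y        # joint indexed [y, x]
    mask = p_yx > 0
    return float(np.sum(p_yx[mask] * np.log((p_x_given_y / p_x)[mask])))

# Hypothetical greedy tasking step: task the candidate sensor whose
# observation is expected to move the belief about X the most.
p_x = np.array([0.7, 0.3])                   # prior belief about target X
sensors = {                                  # illustrative likelihoods p(y|x)
    "noisy": np.array([[0.6, 0.4], [0.4, 0.6]]),
    "sharp": np.array([[0.9, 0.1], [0.2, 0.8]]),
}
scores = {name: inferential_deviation(p_x, lik).sum() for name, lik in sensors.items()}
best = max(scores, key=scores.get)
print(best, scores, {n: mutual_information(p_x, lik) for n, lik in sensors.items()})
```

Run on this toy problem, both scores select the "sharp" sensor; the paper's claim concerns cases where the inferential-deviation and mutual-information rankings diverge, evaluated by root-mean-squared error against ground-truth probabilities.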