There is currently renewed interest in the Bayesian predictive approach to statistics. This paper reviews its foundational concepts and focuses on predictive modeling, which, by reasoning directly on prediction, either bypasses inferential models or characterizes them. We detail predictive characterizations in exchangeable and partially exchangeable settings, for a wide variety of data structures, and hint at new directions. The underlying concept is that Bayesian predictive rules are probabilistic learning rules, formalizing through conditional probability how we learn about future events given the available information. This concept has implications for any statistical problem and for inference, from classic contexts to less explored challenges, such as providing Bayesian uncertainty quantification for predictive algorithms in data science, as we show in the last part of the paper. The paper gives a historical overview, but also includes a few new results, presents some recent developments, and poses some open questions.
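The idea of a Bayesian predictive rule as a probabilistic learning rule can be made concrete with a minimal, standard example that is not specific to this paper: the Beta-Bernoulli model, whose one-step-ahead predictive probability is a simple closed-form update of the observed counts. This is a sketch for illustration only; the function name and defaults are our own.

```python
# Illustrative sketch: the Beta-Bernoulli predictive rule, a classic example
# of a Bayesian predictive (probabilistic learning) rule. With a Beta(a, b)
# prior on the success probability, the predictive probability of the next
# success given n binary observations x_1, ..., x_n is
#   P(X_{n+1} = 1 | x_1, ..., x_n) = (a + sum(x)) / (a + b + n),
# so the rule updates sequentially as each observation arrives.

def beta_bernoulli_predictive(data, a=1.0, b=1.0):
    """Return the sequence of one-step-ahead predictive probabilities."""
    preds = []
    successes = 0
    for n, x in enumerate(data):
        # prediction made *before* observing x, using the first n data points
        preds.append((a + successes) / (a + b + n))
        successes += x
    # final predictive probability after all observations
    preds.append((a + successes) / (a + b + len(data)))
    return preds

probs = beta_bernoulli_predictive([1, 1, 0, 1])
print(probs)  # [0.5, 0.666..., 0.75, 0.6, 0.666...]
```

Note that the model parameter (the success probability) never appears explicitly: the rule reasons directly on the next observation, which is the predictive viewpoint the paper develops.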