We introduce a general IFS Bayesian method for obtaining posterior probabilities from prior probabilities, together with a generalized Bayes' rule that covers both a dynamical and a non-dynamical setting. Given a loss function $l$, we describe the prior and posterior elements, derive their consequences, and exhibit several examples. Taking $\Theta$ as the set of parameters and $Y$ as the set of data (which usually arises from random samples), a general IFS is a measurable map $\tau:\Theta\times Y \to Y$, which can be interpreted as a family of maps $\tau_\theta:Y\to Y,\,\theta\in\Theta$. The main inspiration for our results comes from a paper by Zellner (in a setting with no dynamics), where Bayes' rule is related to a principle of minimization of information. We show that the posterior probabilities produced by our IFS Bayesian method (which are associated with holonomic probabilities) arise as the optimal solution of a variational principle, corresponding to the pressure in Thermodynamic Formalism and to the principle of minimization of information in Information Theory. Among other results, we present the prior dynamical elements and derive the corresponding posterior elements via the Ruelle operator of Thermodynamic Formalism, obtaining in this way a dynamical form of Bayes' rule.
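In the non-dynamical setting, the connection between Bayes' rule and the minimization of information can be illustrated by the classical Gibbs variational principle; the display below is a standard statement written in illustrative notation ($\pi$ for the prior, $\ell(\theta)=\log p(y\mid\theta)$ for the log-likelihood, $q$ for a candidate posterior), given here as a sketch of the correspondence rather than as the formulation adopted in the paper:
% illustrative notation, not the paper's; standard Gibbs variational principle
\[
\log \int_{\Theta} e^{\ell(\theta)}\, d\pi(\theta)
  \;=\; \sup_{q \ll \pi}\left\{ \int_{\Theta} \ell(\theta)\, dq(\theta) \;-\; D_{\mathrm{KL}}(q\,\|\,\pi) \right\},
\qquad
dq^{*}(\theta) \;=\; \frac{e^{\ell(\theta)}\, d\pi(\theta)}{\int_{\Theta} e^{\ell}\, d\pi}.
\]
The left-hand side, the log marginal likelihood, plays the role of the pressure $P(\phi)=\sup_{\mu}\{h(\mu)+\int \phi\, d\mu\}$ of Thermodynamic Formalism, and the unique maximizer $q^{*}$ is the Bayes posterior, which recovers Zellner's optimality property of Bayes' rule.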