We introduce a general IFS Bayesian method for obtaining posterior probabilities from prior probabilities, together with a generalized Bayes' rule that covers both a dynamical and a non-dynamical setting. Given a loss function $l$, we describe the prior and posterior objects, derive their consequences, and exhibit several examples. Taking $\Theta$ as the set of parameters and $Y$ as the set of data (which usually provides random samples), a general IFS is a measurable map $\tau:\Theta\times Y \to Y$, which can be interpreted as a family of maps $\tau_\theta:Y\to Y$, $\theta\in\Theta$. The main inspiration for our results comes from a paper by Zellner (with no dynamics), where Bayes' rule is related to a principle of minimization of information. We show that our IFS Bayesian method, which produces posterior probabilities (associated to holonomic probabilities), corresponds to the optimal solution of a variational principle, somehow analogous to the pressure in Thermodynamic Formalism, and also to the principle of minimization of information in Information Theory.
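As a minimal illustration of the notion of a general IFS (the specific choices of $\Theta$, $Y$ and of the maps below are an assumption made here for concreteness, not the construction of the paper), one may take $\Theta=\{1,2\}$ as the parameter set and $Y=[0,1]$ as the data set, with two affine contractions
\[
  \tau_1(y) \;=\; \tfrac{1}{2}\,y,
  \qquad
  \tau_2(y) \;=\; \tfrac{1}{2}\,y + \tfrac{1}{2},
  \qquad y \in Y=[0,1],
\]
so that $\tau(\theta,y)=\tau_\theta(y)$ defines a measurable map $\tau:\Theta\times Y\to Y$, and each parameter $\theta\in\Theta$ acts on the data $y$ through the dynamics $\tau_\theta$.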