Bayes' rule has enabled innumerable powerful algorithms in statistical signal processing and statistical machine learning. However, when the prior distributions and/or data distributions are misspecified, the direct application of Bayes' rule is questionable. Philosophically, the key is to balance the relative importance of the prior and data distributions when calculating posterior distributions: if the prior (resp. data) distributions are overly conservative, we should upweight the prior belief (resp. data evidence); if the prior (resp. data) distributions are overly opportunistic, we should downweight the prior belief (resp. data evidence). This paper derives a generalized Bayes' rule, called the uncertainty-aware Bayes' rule, to technically realize the above philosophy, i.e., to combat model uncertainties in prior distributions and/or data distributions. Simulated and real-world experiments showcase the superiority of the presented uncertainty-aware Bayes' rule over the conventional Bayes' rule; in particular, the uncertainty-aware Kalman filter, the uncertainty-aware particle filter, and the uncertainty-aware interactive multiple model filter are proposed and validated.
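To fix ideas, one generic way to encode such a reweighting of prior belief against data evidence (shown here only as an illustrative sketch with hypothetical weights $\alpha$ and $\beta$, not necessarily the specific rule derived in the paper) is a tempered, or generalized, Bayes update:
\[
\pi(\theta \mid x) \;\propto\; \pi(\theta)^{\alpha}\, p(x \mid \theta)^{\beta}, \qquad \alpha, \beta > 0,
\]
where choosing $\alpha > 1$ (resp. $\beta > 1$) upweights the prior (resp. the data), choosing $\alpha < 1$ (resp. $\beta < 1$) downweights it, and $\alpha = \beta = 1$ recovers the conventional Bayes' rule.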