We consider the problem of parameter estimation in a Bayesian setting and propose a general lower bound on the Bayesian risk that covers part of the family of $f$-divergences. The results are then applied to specific settings of interest and compared to other notable results in the literature. In particular, we show that the known bounds based on mutual information can be improved by using, for example, maximal leakage, the Hellinger divergence, or generalizations of the hockey-stick divergence.
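To fix the terminology, we recall the standard definitions these results build on (standard background; the notation below is ours, not quoted from the paper). The Bayesian risk is the best achievable expected loss when estimating a parameter $\theta$ drawn from a prior $\pi$ given an observation $X$, and the $f$-divergences form the family to which the Hellinger and hockey-stick divergences belong.
% Bayesian risk under a loss \ell, prior \pi, and observation model P_{X|\theta}:
\[
  R \;=\; \inf_{\hat{\theta}} \; \mathbb{E}_{\theta \sim \pi,\; X \sim P_{X \mid \theta}}
  \Bigl[ \ell\bigl(\theta, \hat{\theta}(X)\bigr) \Bigr],
\]
% f-divergence between distributions P and Q, for convex f with f(1) = 0:
\[
  D_f(P \,\|\, Q) \;=\; \mathbb{E}_{Q}\!\left[ f\!\left( \frac{dP}{dQ} \right) \right].
\]
% Representative choices of f:
%   f(t) = t \log t             gives the KL divergence (and mutual information,
%                               when P is a joint law and Q the product of its marginals);
%   f(t) = (\sqrt{t} - 1)^2     gives the squared Hellinger distance (up to convention);
%   f(t) = \max(t - \gamma, 0)  gives the hockey-stick divergence E_\gamma.
% Maximal leakage is a related measure (Sibson mutual information of order \infty)
% that lies outside the f-divergence family.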