This paper focuses on parameter estimation and introduces a new method for lower bounding the Bayesian risk. The method allows for the use of virtually \emph{any} information measure, including R\'enyi's $\alpha$-Divergences, $\varphi$-Divergences, and Sibson's $\alpha$-Mutual Information. The approach considers divergences as functionals of measures and exploits the duality between spaces of measures and spaces of functions. In particular, we show that one can lower bound the risk with any information measure by upper bounding its dual via Markov's inequality. Since divergences satisfy Data-Processing Inequalities, we are thus able to provide estimator-independent impossibility results. The results are then applied to settings of interest involving both discrete and continuous parameters, including the ``Hide-and-Seek'' problem, and compared to state-of-the-art techniques. An important observation is that the behaviour of the lower bound as a function of the number of samples is influenced by the choice of the information measure. We leverage this by introducing a new divergence inspired by the ``Hockey-Stick'' Divergence, which is empirically demonstrated to provide the largest lower bound across all considered settings. If the observations are subject to privatisation, stronger impossibility results can be obtained via Strong Data-Processing Inequalities. The paper also discusses some generalisations and alternative directions.
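To give a flavour of how such a bound can arise, consider the following illustrative sketch; the notation (the threshold $\rho$, the joint law $P_{\theta X}$, and the product law $P_\theta P_X$) is introduced here for exposition only and the chain below is a generic template for bounds of this kind, not a restatement of the paper's exact derivation. For a non-negative loss $\ell$ and any estimator $\hat\theta$, Markov's inequality gives
\[
  R \;=\; \mathbb{E}\!\left[\ell\big(\theta,\hat\theta(X)\big)\right] \;\ge\; \rho\,\mathbb{P}\!\left(\ell\big(\theta,\hat\theta(X)\big)\ge\rho\right) \;=\; \rho\left(1-\mathbb{P}\!\left(\ell\big(\theta,\hat\theta(X)\big)<\rho\right)\right),
\]
and the probability of a small loss under the joint law $P_{\theta X}$ is then upper bounded, via a change of measure based on the dual (variational) representation of the chosen divergence, in terms of the same event under the product law $P_\theta P_X$, where it no longer depends on the estimator, plus a penalty expressed through the divergence between $P_{\theta X}$ and $P_\theta P_X$.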