This paper focuses on parameter estimation and introduces a new method for lower bounding the Bayesian risk. The method allows for the use of virtually \emph{any} information measure, including R\'enyi's $\alpha$-Divergences, $\varphi$-Divergences, and Sibson's $\alpha$-Mutual Information. The approach views divergences as functionals of measures and exploits the duality between spaces of measures and spaces of functions. In particular, we show that one can lower bound the risk with any information measure by upper bounding its dual via Markov's inequality. We are thus able to provide estimator-independent impossibility results thanks to the Data-Processing Inequalities that divergences satisfy. The results are then applied to settings of interest involving both discrete and continuous parameters, including the ``Hide-and-Seek'' problem, and compared with state-of-the-art techniques. An important observation is that the behaviour of the lower bound as a function of the number of samples depends on the choice of the information measure. We leverage this by introducing a new divergence, inspired by the ``Hockey-Stick'' Divergence, which is empirically shown to provide the largest lower bound across all considered settings. If the observations are subject to privatisation, stronger impossibility results can be obtained via Strong Data-Processing Inequalities. The paper also discusses some generalisations and alternative directions.
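To make the mechanics concrete, the following is a minimal sketch of the Kullback--Leibler instance of the template described above; the notation ($W$ for the parameter with prior $P_W$, $X$ for the observations, $\hat{w}$ for an estimator, $\ell$ for the loss, $\rho>0$ for a threshold) is not fixed by the abstract and is introduced here purely for illustration. For any estimator $\hat{w}$, Markov's inequality gives
\[
R \;=\; \mathbb{E}\!\left[\ell\big(W,\hat{w}(X)\big)\right] \;\ge\; \rho\,\Big(1 - P_{WX}\big(E\big)\Big),
\qquad E := \big\{\ell\big(W,\hat{w}(X)\big) < \rho\big\},
\]
and applying the Data-Processing Inequality to the indicator of $E$ (comparing $P_{WX}$ with the product $P_W \otimes P_X$) yields, whenever the prior small-ball probability $L_W(\rho) := \sup_{w'} P_W\big(\ell(W,w') < \rho\big)$ is strictly below one,
\[
P_{WX}(E) \;\le\; \frac{I(W;X) + \log 2}{\log\big(1/L_W(\rho)\big)},
\]
so that the risk of \emph{every} estimator is lower bounded in terms of the mutual information and a quantity that depends only on the prior and the loss. The bounds announced in the abstract follow the same template with the mutual-information term replaced by more general measures (R\'enyi, $\varphi$-Divergences, Sibson's $\alpha$-Mutual Information) through their dual representations; those bounds are not reproduced in this sketch.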