Estimating mutual information between continuous random variables is often intractable and extremely challenging for high-dimensional data. Recent progress has leveraged neural networks to optimize variational lower bounds on mutual information. Although these variational methods show promise for this difficult problem, they have been shown, both theoretically and empirically, to have serious statistical limitations: 1) many methods struggle to produce accurate estimates when the underlying mutual information is either low or high; 2) the resulting estimators may suffer from high variance. In contrast, our approach trains a classifier that estimates the probability that a data sample pair is drawn from the joint distribution rather than from the product of its marginal distributions. We establish a direct connection between mutual information and the average log odds produced by the classifier on a test set, which leads to a simple and accurate estimator of mutual information. We show theoretically that our method and other variational approaches are equivalent at their optima, while our method sidesteps the variational bound. Empirical results demonstrate the high accuracy of our approach and the advantages of our estimator in the context of representation learning. Our demo is available at https://github.com/RayRuizhiLiao/demi_mi_estimator.
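The connection the abstract mentions follows from the density-ratio trick: for a classifier trained on balanced classes, the optimal output d(x, y) equals p(x, y) / (p(x, y) + p(x)p(y)), so its log odds log(d / (1 - d)) recover the pointwise mutual information log(p(x, y) / (p(x)p(y))), and averaging the log odds over test pairs drawn from the joint gives the mutual information estimate. Below is a minimal illustrative sketch of this idea, not the authors' implementation (see the repository above). It assumes a bivariate Gaussian toy problem with known ground truth, uses scikit-learn's MLPClassifier as the discriminator, and all hyperparameters are arbitrary choices.

    # Sketch of a classifier-based MI estimator (density-ratio trick).
    # Not the authors' code; toy problem and hyperparameters are assumptions.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    rho, n = 0.8, 20000
    cov = [[1.0, rho], [rho, 1.0]]

    # Joint samples (x, y) ~ p(x, y): correlated Gaussian pairs.
    joint = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    # Product-of-marginals samples: shuffle y to break the dependence.
    prod = np.column_stack([joint[:, 0], rng.permutation(joint[:, 1])])

    # Label 1 = pair drawn from the joint, 0 = from the product of marginals.
    X = np.vstack([joint, prod])
    labels = np.concatenate([np.ones(n), np.zeros(n)])

    clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=300).fit(X, labels)

    # Fresh test pairs from the joint; MI estimate = average log odds.
    test = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    p = np.clip(clf.predict_proba(test)[:, 1], 1e-6, 1 - 1e-6)
    mi_hat = np.mean(np.log(p) - np.log1p(-p))

    print(f"estimated MI: {mi_hat:.3f}")
    print(f"true MI:      {-0.5 * np.log(1 - rho**2):.3f}")

Shuffling y within the sample is one common way to draw approximate samples from the product of marginals; the true mutual information of this Gaussian pair, -0.5 * log(1 - rho^2) ≈ 0.51 nats, serves as a sanity check on the estimate.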