As a fundamental concept in information theory, mutual information ($MI$) is commonly used to quantify the association between random vectors. Most existing nonparametric estimators of $MI$ have unstable statistical performance because they depend on tuning parameters. We develop a consistent and powerful estimator, called fastMI, that requires no parameter tuning. Based on a copula formulation, fastMI estimates $MI$ by leveraging fast Fourier transform (FFT)-based estimation of the underlying density. Extensive simulation studies show that fastMI outperforms state-of-the-art estimators, with improved estimation accuracy and reduced run time for large data sets. fastMI also provides a powerful test of independence with satisfactory type I error control. Anticipating that it will be a useful tool for estimating mutual information in a broad range of data, we provide an R package, fastMI, for broader dissemination.
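To make the general idea concrete, the sketch below (in Python rather than the R package described above) illustrates one way a copula-based, FFT-accelerated MI estimator can be organized: pseudo-observations are formed from ranks, the copula density is estimated with a binned Gaussian kernel estimator computed by FFT convolution, and MI is estimated as the average log copula density at the data. This is not the fastMI implementation; the grid size, bandwidth rule, and boundary handling are illustrative assumptions.

```python
# Illustrative sketch only: copula-based MI estimation with an FFT-accelerated
# (binned) kernel density estimate. Not the fastMI package; grid size,
# bandwidth rule, and boundary handling are simplifying assumptions.
import numpy as np
from scipy.signal import fftconvolve
from scipy.stats import rankdata


def mi_copula_fft(x, y, grid=255, bandwidth=None):
    """Estimate MI(X, Y) as E[log c(U, V)], where c is the copula density
    estimated by a binned Gaussian KDE computed with FFT convolution."""
    n = len(x)
    # Copula pseudo-observations via ranks, mapped into (0, 1).
    u = rankdata(x) / (n + 1.0)
    v = rankdata(y) / (n + 1.0)

    # Simple rule-of-thumb bandwidth for near-uniform margins (sd of U(0,1) = 1/sqrt(12)).
    if bandwidth is None:
        bandwidth = 0.9 * (1.0 / np.sqrt(12.0)) * n ** (-1.0 / 6.0)

    # Bin the pseudo-observations on a regular grid over the unit square.
    counts, _, _ = np.histogram2d(u, v, bins=grid, range=[[0, 1], [0, 1]])
    delta = 1.0 / grid  # grid cell width

    # Gaussian kernel on grid offsets; FFT convolution evaluates the kernel
    # sums at all grid points at once (grid is odd so alignment is exact).
    offsets = (np.arange(grid) - grid // 2) * delta
    gx, gy = np.meshgrid(offsets, offsets, indexing="ij")
    kernel = np.exp(-(gx**2 + gy**2) / (2.0 * bandwidth**2))
    kernel /= 2.0 * np.pi * bandwidth**2
    density = fftconvolve(counts, kernel, mode="same") / n

    # Crude renormalization so the estimate integrates to one over [0,1]^2
    # (mass smoothed past the boundary is not reflected back in this sketch).
    density /= density.sum() * delta**2

    # MI estimate: average log copula density at the observed pseudo-observations.
    iu = np.clip((u * grid).astype(int), 0, grid - 1)
    iv = np.clip((v * grid).astype(int), 0, grid - 1)
    return float(np.mean(np.log(np.maximum(density[iu, iv], 1e-12))))


if __name__ == "__main__":
    # Bivariate normal check: true MI = -0.5 * log(1 - rho^2).
    rng = np.random.default_rng(0)
    rho = 0.6
    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=5000)
    print("estimate:", mi_copula_fft(z[:, 0], z[:, 1]))
    print("true MI :", -0.5 * np.log(1 - rho**2))
```

The check at the bottom uses bivariate normal data, for which the mutual information has the closed form $-\tfrac{1}{2}\log(1-\rho^2)$, giving a quick sanity comparison for the estimator.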