As a fundamental concept in information theory, mutual information ($MI$) is widely used to quantify the association between random vectors. Most existing nonparametric estimators of $MI$ exhibit unstable statistical performance because they require parameter tuning. We develop a consistent and powerful estimator, called \texttt{fastMI}, that requires no parameter tuning. Based on a copula formulation, \texttt{fastMI} estimates $MI$ by leveraging fast Fourier transform-based estimation of the underlying density. Extensive simulation studies reveal that \texttt{fastMI} outperforms state-of-the-art estimators, with improved estimation accuracy and reduced run time for large data sets. \texttt{fastMI} also provides a test for independence that exhibits satisfactory type I error control. Anticipating that \texttt{fastMI} will be a useful tool for estimating mutual information in a broad range of data, we develop an R package, \texttt{fastMI}, for broader dissemination.
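The copula formulation mentioned above can be made concrete with a standard identity: for a continuous random pair $(X, Y)$ with copula density $c$ on $[0,1]^2$, mutual information depends only on the copula, not on the marginals. A minimal statement of this identity (the notation $c$, $U$, $V$ is ours, not taken from the abstract) is:

\begin{equation*}
MI(X, Y) \;=\; \int_{[0,1]^2} c(u, v)\, \log c(u, v)\, \mathrm{d}u\, \mathrm{d}v,
\qquad U = F_X(X),\; V = F_Y(Y),
\end{equation*}

where $F_X$ and $F_Y$ are the marginal distribution functions. This is why estimating the copula density (here, via an FFT-based density estimate on rank-transformed data) suffices to estimate $MI$, and why $MI = 0$ characterizes independence ($c \equiv 1$).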