In this work, we examine the prevalent use of Frobenius error minimization in covariance matrix cleaning. At present, minimizing the Frobenius error has only a limited interpretation in terms of information theory. To better understand this relationship, we focus on the Kullback-Leibler divergence as a measure of the information lost by the optimal estimators. Our analysis centers on rotationally invariant estimators for data following an inverse Wishart population covariance matrix, and we derive an analytical expression for their Kullback-Leibler divergence. Because the calculations involved are intricate, we combine genetic programming regressors with human intuition. Ultimately, we establish a more precise link between the Frobenius error and information theory, showing that the former corresponds to a first-order expansion term of the Kullback-Leibler divergence.
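As background for the final claim, the following sketch illustrates how a Frobenius-type error generically arises when expanding the Kullback-Leibler divergence between Gaussian models; it uses the standard zero-mean Gaussian result and our own notation ($\mathbf{C}$ for the true covariance, $\boldsymbol{\Xi}$ for its estimator, $\mathbf{E}$ for their difference, $N$ for the dimension), and the precise expansion parameter and normalization used in the paper may differ.
\[
  D_{\mathrm{KL}}\!\left(\mathbf{C}\,\|\,\boldsymbol{\Xi}\right)
  = \frac{1}{2}\left[\operatorname{tr}\!\left(\boldsymbol{\Xi}^{-1}\mathbf{C}\right) - N
  + \ln\frac{\det\boldsymbol{\Xi}}{\det\mathbf{C}}\right].
\]
Writing $\boldsymbol{\Xi} = \mathbf{C} + \mathbf{E}$ and expanding in $\mathbf{E}$, the linear terms cancel and the leading contribution is a weighted squared Frobenius norm of the estimation error,
\[
  D_{\mathrm{KL}}\!\left(\mathbf{C}\,\|\,\mathbf{C}+\mathbf{E}\right)
  = \frac{1}{4}\operatorname{tr}\!\left[\left(\mathbf{C}^{-1}\mathbf{E}\right)^{2}\right]
    + O\!\left(\mathbf{E}^{3}\right)
  = \frac{1}{4}\left\lVert \mathbf{C}^{-1/2}\mathbf{E}\,\mathbf{C}^{-1/2}\right\rVert_{F}^{2}
    + O\!\left(\mathbf{E}^{3}\right),
\]
which shows, in a weighted form, the kind of correspondence between the Frobenius error and an expansion term of the Kullback-Leibler divergence that the abstract refers to.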