We consider the problem of estimating a covariance matrix for Gaussian data in a high-dimensional setting. Existing approaches include maximum likelihood estimation under a pre-specified sparsity pattern, l1-penalized log-likelihood optimization, and ridge regularization of the sample covariance. We show that these three approaches can be addressed in a unified way, by considering the constrained optimization of an objective function that involves two suitably defined penalty terms. This unified procedure exploits the advantages of each individual approach while bringing novelty in their combination. We provide an efficient algorithm for the optimization of the regularized objective function and describe the relationship between the two penalty terms, thereby highlighting the importance of the joint application of the three methods. A simulation study shows that the sparse estimates of covariance matrices returned by the procedure are stable and accurate, both in low- and high-dimensional settings, and that their calculation is more efficient than existing approaches under a partially known sparsity pattern. An illustration on sonar data is presented, identifying the covariance structure among signals bounced off a certain material. The method is implemented in the publicly available R package gicf.
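As a rough sketch of the kind of objective described above, one plausible instantiation (our own illustrative guess; the exact loss, penalty forms, and constraint set used in the paper may differ) combines the Gaussian negative log-likelihood with a zero-pattern constraint, an l1 penalty, and a ridge penalty:

\[
\hat{\Sigma} \;=\; \operatorname*{arg\,min}_{\substack{\Sigma \succ 0 \\ \Sigma_{ij} = 0 \ \text{for } (i,j) \in \mathcal{Z}}} \; \Big\{ \log\det\Sigma \;+\; \operatorname{tr}\!\big(S\,\Sigma^{-1}\big) \;+\; \lambda_1 \sum_{i \neq j} |\Sigma_{ij}| \;+\; \lambda_2 \,\|\Sigma\|_F^2 \Big\},
\]

where \(S\) is the sample covariance, \(\mathcal{Z}\) is the pre-specified (possibly partial) set of zero entries, and \(\lambda_1, \lambda_2 \geq 0\) are the two penalty parameters. Setting \(\lambda_1 = \lambda_2 = 0\) recovers constrained maximum likelihood, \(\mathcal{Z} = \emptyset\) with \(\lambda_2 = 0\) recovers an l1-penalized approach, and \(\lambda_1 = 0\) with \(\mathcal{Z} = \emptyset\) recovers ridge regularization.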