In the Multiple Measurement Vector (MMV) model, the measurement vectors are connected to unknown, jointly sparse signal vectors through a linear regression model employing a single known measurement matrix (or dictionary). Typically, the number of atoms (columns of the dictionary) is greater than the number of measurements, and the sparse signal recovery problem is generally ill-posed. In this paper, we treat the signals and measurement noise as independent Gaussian random vectors with unknown signal covariance matrix and noise variance, respectively, and derive a fixed point (FP) equation for solving the likelihood equation for the signal powers, thereby enabling the recovery of the sparse signal support (sources with non-zero variances). Two practical algorithms that leverage the FP characterization of the likelihood equation, a block coordinate descent (BCD) algorithm and a cyclic coordinate descent (CCD) algorithm, are then proposed. Additionally, a greedy pursuit method, analogous to the popular simultaneous orthogonal matching pursuit (SOMP), is introduced. Our numerical examples demonstrate the effectiveness of the proposed covariance learning (CL) algorithms both in classic sparse signal recovery and in direction-of-arrival (DOA) estimation problems, where they perform favourably compared to state-of-the-art algorithms under a broad variety of settings.
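For concreteness, the MMV and Gaussian modeling assumptions summarized above can be sketched as follows; the symbols used here ($\mathbf{Y}$, $\mathbf{A}$, $\boldsymbol{\Gamma}$, $\sigma^2$, and the dimensions $M$, $N$, $L$) are illustrative choices and need not match the paper's own notation. The measurements collect as
\[
  \mathbf{Y} = \mathbf{A}\mathbf{X} + \mathbf{E}, \qquad \mathbf{A} \in \mathbb{R}^{M \times N},\ N > M,
\]
where the $L$ columns of $\mathbf{X} \in \mathbb{R}^{N \times L}$ share a common sparse support. Modeling each signal vector as $\mathbf{x}_\ell \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Gamma})$ with $\boldsymbol{\Gamma} = \operatorname{diag}(\gamma_1, \ldots, \gamma_N)$ and the noise as $\mathbf{e}_\ell \sim \mathcal{N}(\mathbf{0}, \sigma^2 \mathbf{I})$, independent of the signals, gives
\[
  \mathbf{y}_\ell \sim \mathcal{N}\!\big(\mathbf{0},\ \mathbf{A}\boldsymbol{\Gamma}\mathbf{A}^\top + \sigma^2 \mathbf{I}\big),
\]
so that estimating the signal powers $\gamma_1, \ldots, \gamma_N$ (covariance learning) recovers the sparse support as the index set of non-zero estimated variances, $\{\, i : \hat{\gamma}_i > 0 \,\}$.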