"Toeplitzification" or "redundancy (spatial) averaging", the well-known routine for deriving the Toeplitz covariance matrix estimate from the standard sample covariance matrix, recently regained new attention due to the important Random Matrix Theory (RMT) findings. The asymptotic consistency in the spectral norm was proven for the Kolmogorov's asymptotics when the matrix dimension N and independent identically distributed (i.i.d.) sample volume T both tended to infinity (N->inf, T->inf, T/N->c > 0). These novel RMT results encouraged us to reassess the well-known drawback of the redundancy averaging methodology, which is the generation of the negative minimal eigenvalues for covariance matrices with big eigenvalues spread, typical for most covariance matrices of interest. We demonstrate that for this type of Toeplitz covariance matrices, convergence in the spectral norm does not prevent the generation of negative eigenvalues, even for the sample volume T that significantly exceeds the covariance matrix dimension (T >> N). We demonstrate that the ad-hoc attempts to remove the negative eigenvalues by the proper diagonal loading result in solutions with the very low likelihood. We demonstrate that attempts to exploit Newton's type iterative algorithms, designed to produce a Hermitian Toeplitz matrix with the given eigenvalues lead to the very poor likelihood of the very slowly converging solution to the desired eigenvalues. Finally, we demonstrate that the proposed algorithm for restoration of a positive definite (p.d.) Hermitian Toeplitz matrix with the specified Maximum Entropy spectrum, allows for the transformation of the (unstructured) Hermitian maximum likelihood (ML) sample matrix estimate in a p.d. Toeplitz matrix with sufficiently high likelihood.