Matrix perturbation bounds (such as Weyl's inequality and the Davis-Kahan theorem) are frequently used in many branches of mathematics. Most of the classical results in this area are optimal in worst-case analysis. However, in modern applications, both the ground and the noise matrices frequently have extra structural properties. For instance, it is often assumed that the ground matrix is essentially low rank and that the noise matrix is random or pseudo-random. We aim to rebuild a part of perturbation theory, adapting to these modern assumptions. We do this using a contour expansion argument, which enables us to exploit to our advantage the skewness between the leading eigenvectors of the ground matrix and the noise matrix (which is significant when the two are uncorrelated). In the current paper, we focus on the perturbation of eigenspaces. This case allows us to introduce the arguments in the cleanest way, avoiding the more technical considerations of the general case; in applications, it is also one of the most useful. More general results will appear in a subsequent paper. Our method has led to several improvements with direct applications to central problems. Among others, we derive a sharp bound for the perturbation of a low-rank matrix under random perturbation, answering an open question in this area. Next, we derive new results concerning the spike model, an important model in statistics, bridging two different directions of current research. Finally, we use our results on the perturbation of eigenspaces to derive new results concerning eigenvalues of deterministic and random matrices. In particular, we obtain new results on the outliers of the deformed Wigner model and on the least singular value of random matrices with non-zero mean.
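As a small numerical illustration of the setting described above (a sketch under assumed parameters, not taken from the paper), the script below builds a rank-one ground matrix, perturbs it with symmetric Gaussian noise, and measures the sine of the angle between the leading eigenvectors before and after perturbation. The dimension, signal strength, and all variable names here are hypothetical choices for demonstration; when the leading eigenvalue is well above the noise level (roughly 2√n in operator norm), the Davis-Kahan theorem predicts a small angle.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Rank-one symmetric ground matrix A = lam * u u^T with unit vector u.
u = rng.standard_normal(n)
u /= np.linalg.norm(u)
lam = 50 * np.sqrt(n)              # signal strength, well above the noise norm ~ 2*sqrt(n)
A = lam * np.outer(u, u)

# Symmetric Gaussian noise matrix E (a Wigner-type perturbation).
G = rng.standard_normal((n, n))
E = (G + G.T) / np.sqrt(2)

# Leading eigenvector of the perturbed matrix A + E.
vals, vecs = np.linalg.eigh(A + E)
v = vecs[:, -1]                    # eigenvector for the largest eigenvalue

# sin(theta) between the leading eigenvectors of A and A + E.
cos_theta = abs(np.dot(u, v))
sin_theta = np.sqrt(max(0.0, 1.0 - cos_theta**2))
print(sin_theta)                   # small: Davis-Kahan bounds it by ~ ||E|| / lam
```

Shrinking `lam` toward the noise level (around `2 * np.sqrt(n)`) makes `sin_theta` grow, which is the regime where structure-aware bounds of the kind discussed above become relevant.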