Principal Component Analysis (PCA) is a critical tool for dimensionality reduction and data analysis. This paper revisits PCA through the lens of generalized spiked covariance and correlation models, which allow for more realistic and complex data structures. We explore the asymptotic properties of the sample principal components (PCs) derived from both the sample covariance and correlation matrices, focusing on how data normalization, an essential step for scale-invariant analysis, affects these properties. Our results reveal that while normalization does not alter the first-order limits of the spiked eigenvalues and eigenvectors, it significantly influences their second-order behavior. We establish new theoretical findings, including a joint central limit theorem for bilinear forms of the resolvent of the sample covariance matrix and its diagonal entries, providing a robust framework for understanding spiked models in high dimensions. Our theory also reveals an intriguing phenomenon regarding the effect of data normalization when the variances of the covariates are equal. Specifically, it suggests that high-dimensional PCA based on the correlation matrix may not only perform comparably to, but potentially even outperform, PCA based on the covariance matrix, especially when the leading principal component is sufficiently large. This study not only extends the existing literature on spiked models but also offers practical guidance for applying PCA in real-world scenarios, particularly when dealing with normalized data.
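To make the covariance-versus-correlation comparison concrete, here is a minimal simulation sketch, not the paper's procedure, of a single-spike model in the equal-variance setting highlighted above. The dimension `p`, sample size `n`, and spike strength are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 200, 400   # dimension and sample size; illustrative, not from the paper
spike = 20.0      # strength of the single spiked eigenvalue; assumed

# Equal-variance spiked covariance: identity plus a rank-one spike along the
# constant unit vector, so every covariate has the same variance 1 + spike/p.
v = np.ones(p) / np.sqrt(p)
Sigma = np.eye(p) + spike * np.outer(v, v)

# Draw n i.i.d. Gaussian observations with covariance Sigma.
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# PCA via the sample covariance matrix.
S = np.cov(X, rowvar=False)
u_cov = np.linalg.eigh(S)[1][:, -1]   # leading sample eigenvector

# PCA via the sample correlation matrix, i.e., after normalizing each
# covariate to unit sample variance.
R = np.corrcoef(X, rowvar=False)
u_cor = np.linalg.eigh(R)[1][:, -1]

# Both estimators align with the true spike direction at first order; the
# paper's results concern differences in their second-order fluctuations.
print("cov-based  alignment |<u, v>|:", abs(u_cov @ v))
print("corr-based alignment |<u, v>|:", abs(u_cor @ v))
```

Repeating the draw many times and comparing the spread of the two alignment values would illustrate the second-order effect of normalization that the paper analyzes.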