This paper proposes a new multilinear projection method for dimension reduction in modeling high-dimensional matrix-variate time series. It assumes that a $p_1\times p_2$ matrix-variate time series consists of a dynamically dependent, lower-dimensional matrix-variate factor process and a $p_1\times p_2$ matrix white noise series. The covariance matrix of the vectorized white noise is assumed to have a Kronecker structure in which both the row and column covariances of the noise have diverging/spiked eigenvalues, accommodating the low signal-to-noise ratios often encountered in applications such as finance and economics. We use an iterative projection procedure to reduce the dimensions and noise effects in estimating the front and back loading matrices, and to obtain faster convergence rates than those of the traditional methods available in the literature. Furthermore, we introduce a two-way projected Principal Component Analysis to mitigate the diverging noise effects, and implement a high-dimensional white-noise testing procedure to estimate the dimension of the factor matrix. Asymptotic properties of the proposed method are established as the dimensions and sample size go to infinity. Simulated and real examples are used to assess the performance of the proposed method. We also compare the proposed method with some existing ones in the literature in terms of the forecasting ability of the identified factors and find that the proposed approach fares well in out-of-sample forecasting.
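To make the estimation idea concrete, the following is a minimal sketch of a generic two-way iterative projection estimator for a matrix factor model of the form $X_t = R F_t C' + E_t$, assuming known factor dimensions $k_1, k_2$. The initialization via pooled sample covariances, the normalization of the loadings, and the function name are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def iterative_projection_factors(X, k1, k2, n_iter=10):
    """Illustrative two-way projected PCA for a matrix factor model.

    X: array of shape (T, p1, p2), the observed matrix-valued time series.
    k1, k2: assumed numbers of row and column factors.
    Returns the estimated front loading R_hat (p1, k1), back loading
    C_hat (p2, k2), and factor matrices F_hat (T, k1, k2).
    """
    T, p1, p2 = X.shape

    # Initial back-loading estimate: top eigenvectors of the pooled
    # column sample covariance (an illustrative initialization).
    Mc = sum(Xt.T @ Xt for Xt in X) / (T * p1)
    C_hat = np.linalg.eigh(Mc)[1][:, -k2:][:, ::-1]

    for _ in range(n_iter):
        # Project onto the current column space, then PCA for the front loading.
        Y = X @ C_hat                                  # shape (T, p1, k2)
        Mr = sum(Yt @ Yt.T for Yt in Y) / (T * p2)
        R_hat = np.linalg.eigh(Mr)[1][:, -k1:][:, ::-1]

        # Project onto the current row space, then PCA for the back loading.
        Z = np.transpose(X, (0, 2, 1)) @ R_hat         # shape (T, p2, k1)
        Mc = sum(Zt @ Zt.T for Zt in Z) / (T * p1)
        C_hat = np.linalg.eigh(Mc)[1][:, -k2:][:, ::-1]

    # Estimated factor matrices under an orthonormal-loading normalization.
    F_hat = np.einsum('ij,tjl,lk->tik', R_hat.T, X, C_hat)
    return R_hat, C_hat, F_hat
```

In this sketch each projection step averages out part of the idiosyncratic noise before the PCA step, which is the mechanism behind the faster convergence rates described in the abstract; the paper's actual estimator additionally handles the spiked Kronecker noise covariance and estimates $k_1, k_2$ via white-noise testing.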