This article proposes to model large-dimensional matrix time series by introducing a regression term into the matrix factor model, extending the classical matrix factor model to incorporate the information of known factors or useful covariates. We establish the convergence rates of the estimated coefficient matrix, the loading matrices, and the signal part; these rates coincide with those in Wang et al. (2019). We conduct numerical studies to verify the finite-sample performance of our estimation procedure. Finally, we demonstrate the advantages of the proposed model on daily stock return data.
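For concreteness, the following is a hedged sketch of one plausible specification of such a model; the abstract does not state the exact form, so the notation $Y_t$, $X_t$, $B$, $R$, $F_t$, $C$, $E_t$ is illustrative and builds on the matrix factor model $Y_t = R F_t C^{\top} + E_t$ of Wang et al. (2019):
\[
  Y_t \;=\; \underbrace{X_t B}_{\text{regression term on known covariates}}
  \;+\; \underbrace{R\, F_t\, C^{\top}}_{\text{matrix factor structure}}
  \;+\; E_t, \qquad t = 1,\dots,T,
\]
where $Y_t \in \mathbb{R}^{p_1 \times p_2}$ is the observed matrix at time $t$, $X_t$ collects the known covariates with coefficient matrix $B$, $R \in \mathbb{R}^{p_1 \times k_1}$ and $C \in \mathbb{R}^{p_2 \times k_2}$ are the loading matrices, $F_t \in \mathbb{R}^{k_1 \times k_2}$ is the latent factor matrix, and $E_t$ is the idiosyncratic noise.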