We study low-rank matrix regression in settings where matrix-valued predictors and scalar responses are observed across multiple individuals. Rather than assuming fully homogeneous coefficient matrices across individuals, we accommodate shared low-dimensional structure alongside individual-specific deviations. To this end, we introduce a tensor-structured homogeneity pursuit framework, wherein each coefficient matrix is represented as a product of shared low-rank subspaces and individualized low-rank loadings. We propose a scalable estimation procedure based on scaled gradient descent, and establish non-asymptotic bounds demonstrating that the proposed estimator attains improved convergence rates by leveraging shared information while preserving individual-specific signals. The framework is further extended to incorporate scaled hard thresholding for recovering sparse latent structures, with theoretical guarantees in both linear and generalized linear model settings. Our approach provides a principled middle ground between fully pooled and fully separate analyses, achieving strong theoretical performance, computational tractability, and interpretability in high-dimensional multi-individual matrix regression problems.
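As a rough illustration of the shared-plus-individual structure described above (a sketch only; the notation $y_{i,j}$, $X_{i,j}$, $U$, $V$, $S_i$ and the exact factorization are introduced here for exposition and are not taken from the paper), one plausible instantiation of the model is
\[
  y_{i,j} \;=\; \big\langle X_{i,j},\, \Theta_i \big\rangle + \varepsilon_{i,j},
  \qquad
  \Theta_i \;=\; U\, S_i\, V^{\top},
  \qquad i = 1, \dots, n,\ \ j = 1, \dots, m_i,
\]
where $U \in \mathbb{R}^{p \times r_1}$ and $V \in \mathbb{R}^{q \times r_2}$ would play the role of low-rank subspaces shared across individuals, and $S_i \in \mathbb{R}^{r_1 \times r_2}$ the individualized low-rank loadings; stacking the coefficient matrices $\Theta_1, \dots, \Theta_n$ along a third mode gives the tensor structure to which the homogeneity pursuit framework refers.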