Matrix variate regression models have been studied in many existing works, but classical statistical and computational methods for estimating the regression coefficients are strongly affected by high-dimensional and noisy matrix-valued predictors. To address these issues, this paper proposes a framework of matrix variate regression models based on a rank constraint, vector regularization (e.g., sparsity), and a general loss function, with three special cases considered: ordinary matrix regression, robust matrix regression, and matrix logistic regression. We also propose an alternating projected gradient descent algorithm. By analyzing our objective functions on manifolds with bounded curvature, we show that the algorithm is guaranteed to converge, that all accumulation points of the iterates have estimation errors of order $O(1/\sqrt{n})$ asymptotically, and that they essentially attain the minimax rate. Our theoretical analysis applies to general optimization problems on manifolds with bounded curvature and is an important technical contribution of this work. We validate the proposed method through simulation studies and real image data examples.
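A minimal sketch of the kind of alternating projected gradient descent the abstract describes, for a least-squares matrix regression with a rank constraint and an entrywise sparsity penalty. All names, step-size rules, and the exact alternation order here are illustrative assumptions, not the paper's specification: each iteration takes a gradient step on the loss, applies soft-thresholding (the proximal map of the $\ell_1$ penalty), and projects back onto the rank-$r$ manifold via a truncated SVD.

```python
import numpy as np

def rank_project(B, r):
    """Project B onto matrices of rank at most r via truncated SVD."""
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

def soft_threshold(B, tau):
    """Entrywise soft-thresholding: proximal map of the l1 penalty."""
    return np.sign(B) * np.maximum(np.abs(B) - tau, 0.0)

def matrix_regression_pgd(X, y, r, lam=0.0, step=None, n_iter=200):
    """Illustrative alternating projected gradient descent for
        min_B (1/2n) * sum_i (y_i - <X_i, B>)^2 + lam * ||B||_1
        s.t. rank(B) <= r,
    with X of shape (n, p, q). The paper's actual algorithm,
    loss functions, and step-size choices may differ."""
    n, p, q = X.shape
    B = np.zeros((p, q))
    if step is None:
        # 1/L step, L = Lipschitz constant of the least-squares gradient.
        L = np.linalg.norm(X.reshape(n, -1), ord=2) ** 2 / n
        step = 1.0 / L
    for _ in range(n_iter):
        resid = np.einsum('ipq,pq->i', X, B) - y      # <X_i, B> - y_i
        grad = np.einsum('i,ipq->pq', resid, X) / n   # gradient of the loss
        B = soft_threshold(B - step * grad, step * lam)  # sparsity prox step
        B = rank_project(B, r)                           # rank projection
    return B
```

With the ordinary (least-squares) loss this recovers a low-rank coefficient matrix from noisy scalar responses; the robust and logistic cases from the abstract would swap in a different loss and gradient while keeping the same projection structure.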