We study deviation inequalities for sums of high-dimensional random matrices and operators with dependence and arbitrarily heavy tails. Estimating high-dimensional matrices is a problem of growing importance, and dependence and heavy-tailed behavior of the data are among its most critical aspects. In this paper, we derive a dimension-free upper bound on the deviation, that is, a bound that does not depend explicitly on the dimension of the matrices but instead on their effective rank. Our result generalizes several existing studies on the deviation of sums of matrices. The proof relies on two techniques: (i) a variational approximation of the dual of moment generating functions, and (ii) robustification through truncation of the eigenvalues of the matrices. We show that our results apply to several problems, including covariance matrix estimation, hidden Markov models, and overparameterized linear regression.
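For concreteness, the two central quantities named above can be written under standard conventions as follows; these are common textbook definitions given for illustration, and the threshold $\tau$ and the notation $\psi_\tau$ are assumptions rather than necessarily the exact choices made in the paper. The effective rank of a positive semi-definite matrix $\Sigma$ is typically
\[
  \mathbf{r}(\Sigma) \;=\; \frac{\operatorname{tr}(\Sigma)}{\lVert \Sigma \rVert_{\mathrm{op}}},
\]
which can be much smaller than the ambient dimension when the spectrum decays quickly. A common form of eigenvalue truncation at level $\tau > 0$ applies a bounded scalar map spectrally to a symmetric matrix $A = \sum_i \lambda_i v_i v_i^\top$, for example
\[
  \psi_\tau(A) \;=\; \sum_i \operatorname{sign}(\lambda_i)\,\min\{\lvert \lambda_i \rvert, \tau\}\, v_i v_i^\top,
\]
so that $\lVert \psi_\tau(A) \rVert_{\mathrm{op}} \le \tau$ regardless of how heavy-tailed the entries of $A$ are.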