A variety of dimensionality reduction techniques have been applied for computations involving large matrices. The underlying matrix is randomly compressed into a smaller one, while approximately retaining many of its original properties. As a result, much of the expensive computation can be performed on the small matrix. The sketching of positive semidefinite (PSD) matrices is well understood, but there are many applications where the related matrices are not PSD, including Hessian matrices in non-convex optimization and covariance matrices in regression applications involving complex numbers. In this paper, we present novel dimensionality reduction methods for non-PSD matrices, as well as their ``square-roots'', which involve matrices with complex entries. We show how these techniques can be used for multiple downstream tasks. In particular, we show how to use the proposed matrix sketching techniques for both convex and non-convex optimization, $\ell_p$-regression for every $1 \leq p \leq \infty$, and vector-matrix-vector queries.
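To make the generic compression idea concrete, the following is a minimal NumPy sketch of Gaussian sketching: a tall matrix $A$ is multiplied by a small random matrix $S$, and norms of the form $\|Ax\|$ are approximately preserved by the much smaller sketch $SA$. The dimensions, variable names, and the plain Gaussian construction are illustrative assumptions, not the paper's specific methods for non-PSD matrices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): a tall 2000 x 30 matrix, sketched to 400 rows.
n, d, k = 2000, 30, 400
A = rng.standard_normal((n, d))

# Gaussian sketching matrix with N(0, 1/k) entries, so that E[S^T S] = I_n.
S = rng.standard_normal((k, n)) / np.sqrt(k)

# The sketch SA is k x d instead of n x d; expensive downstream work
# can operate on SA in place of A.
SA = S @ A

# For a fixed vector x, ||S A x|| concentrates around ||A x||
# (relative error on the order of 1/sqrt(k)).
x = rng.standard_normal(d)
ratio = np.linalg.norm(SA @ x) / np.linalg.norm(A @ x)
print(SA.shape, round(ratio, 3))
```

For PSD matrices, guarantees of this flavor extend to quadratic forms $x^\top A x$; the non-PSD and complex ("square-root") settings studied in the paper require the different constructions developed there.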