In this work, we study a tensor-structured random sketching matrix that projects a large-scale convex optimization problem onto a much lower-dimensional counterpart, leading to substantial savings in memory and computation. We show that, while controlling the prediction error between the randomized estimator and the true solution with high probability, the dimension of the projected problem achieves optimal dependence on the geometry of the constraint set. Moreover, the tensor structure and sparsity pattern of the structured random matrix yield additional computational advantages. Our analysis is based on chaining arguments from probability theory, which allow us to obtain an almost sharp estimate of the sketching dimension for convex optimization problems. Consequences of our main result are demonstrated in several concrete examples, including unconstrained linear regression and sparse recovery problems.
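To make the sketch-and-solve idea concrete, the following is a minimal illustration for the unconstrained linear regression example mentioned above. It assumes a Kronecker-product Gaussian sketch S = S1 ⊗ S2 applied implicitly (the factor sizes, the helper `apply_kron_sketch`, and all problem dimensions are illustrative assumptions, not the paper's exact construction).

```python
# A minimal sketch-and-solve illustration (assumed construction: S = S1 ⊗ S2
# with small Gaussian factors, applied without ever forming the full sketch).
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes only.
n1, n2 = 64, 64          # n = n1 * n2 rows in the original least-squares problem
n, d = n1 * n2, 20       # d unknowns
m1, m2 = 16, 16          # sketched dimension m = m1 * m2 << n

# Large unconstrained linear regression: min_x ||A x - b||_2^2.
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

# Tensor-structured sketch S = S1 ⊗ S2 (scaled Gaussian factors).
S1 = rng.standard_normal((m1, n1)) / np.sqrt(m1)
S2 = rng.standard_normal((m2, n2)) / np.sqrt(m2)

def apply_kron_sketch(M):
    """Compute (S1 ⊗ S2) @ M column by column without forming the Kronecker
    product, via the identity (S1 ⊗ S2) vec(X) = vec(S2 X S1^T) with
    column-major vec."""
    cols = []
    for j in range(M.shape[1]):
        X = M[:, j].reshape(n2, n1, order="F")
        cols.append((S2 @ X @ S1.T).reshape(-1, order="F"))
    return np.column_stack(cols)

# Sketch the data and solve the much smaller least-squares problem.
SA = apply_kron_sketch(A)                      # (m1*m2) x d
Sb = apply_kron_sketch(b.reshape(-1, 1))[:, 0]
x_sketch, *_ = np.linalg.lstsq(SA, Sb, rcond=None)

# Compare against the full-dimensional solve.
x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
print("relative prediction error:",
      np.linalg.norm(A @ (x_sketch - x_full)) / np.linalg.norm(A @ x_full))
```

The implicit application is where the computational advantage claimed in the abstract would show up: each column of A is sketched through two small matrix products instead of one multiplication by an m x n matrix.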