Low-rank approximation (hereafter LRA) of a matrix is a highly active research subject, fundamental for Matrix and Tensor Computations and for Big Data Mining and Analysis. Computations with an LRA can be performed at sublinear cost, that is, by using far fewer memory cells and arithmetic operations than the input matrix has entries. Can we, however, compute an LRA itself at sublinear cost? This is impossible for worst-case inputs, but our sublinear-cost deterministic variations of a popular randomized subspace sampling algorithm output accurate LRA for a large class of inputs and, in a sense, for most input matrices that admit LRA. This follows because we prove that, with high probability, these deterministic algorithms output a close LRA of a random input matrix that admits LRA. Our numerical tests are in rather good accordance with our formal analysis. In other papers we propose and analyze further algorithms of this kind for LRA and for other important matrix computations.
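For reference, below is a minimal NumPy sketch of the classical randomized subspace sampling (range finder) algorithm that the abstract refers to. This is the standard randomized scheme, not the paper's deterministic variant; the function name and parameters are illustrative. The comments note, as an assumption about the paper's approach, that the sublinear-cost variations would replace the dense Gaussian multiplier with sparse or structured test matrices so that only a small fraction of the input's entries is touched.

```python
import numpy as np

def subspace_sampling_lra(A, k, oversample=5, seed=None):
    """Rank-(k + oversample) LRA of A via randomized subspace sampling.

    Classical randomized range finder: multiply A by a random test
    matrix, orthogonalize the sample, and project A onto that basis.
    The paper's deterministic sublinear-cost variants (assumption:
    details are in the paper, not reproduced here) replace the dense
    Gaussian multiplier Omega with sparse/structured multipliers so
    that far fewer than m*n entries of A are read.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    ell = k + oversample
    Omega = rng.standard_normal((n, ell))  # dense Gaussian test matrix
    Y = A @ Omega                          # sample the range of A
    Q, _ = np.linalg.qr(Y)                 # orthonormal basis of the sample
    B = Q.T @ A                            # project A onto the basis
    return Q, B                            # A is approximated by Q @ B

# Usage: a 500 x 400 matrix of numerical rank 10 plus small noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((500, 10)) @ rng.standard_normal((10, 400))
A += 1e-6 * rng.standard_normal((500, 400))
Q, B = subspace_sampling_lra(A, k=10, seed=1)
print(np.linalg.norm(A - Q @ B) / np.linalg.norm(A))  # small relative error
```

As written, this reference scheme costs O(mnk) arithmetic operations, i.e., superlinear; the point of the paper is that suitably chosen deterministic variations of it reach sublinear cost while, with high probability over random inputs that admit LRA, still producing a close LRA.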