Efficient and accurate low-rank approximation (LRA) methods are of great significance for large-scale data analysis. Randomized tensor decompositions have emerged as powerful tools to meet this need, but most existing methods perform poorly in the presence of noise interference. Inspired by the remarkable performance of randomized block Krylov iteration (rBKI) in reducing the effect of tail singular values, this work designs an rBKI-based Tucker decomposition (rBKI-TK) for accurate approximation, together with a hierarchical tensor ring decomposition based on rBKI-TK for efficient compression of large-scale data. In addition, the error bound between the deterministic LRA and the randomized LRA is studied. Numerical experiments demonstrate the efficiency, accuracy and scalability of the proposed methods in both data compression and denoising.