Due to their importance in both data analysis and numerical algorithms, low-rank approximations have recently been studied extensively; they make it possible to handle very large matrices. Tight error bounds are available for the computationally efficient methods based on Gaussian elimination (skeleton approximations). In practice, these bounds are useful for matrices whose singular values decrease quickly. Using the Chebyshev norm, this paper provides improved bounds for the errors of the matrix elements. These bounds are substantially better in the practically relevant cases where the singular values decrease only polynomially. Results are proven for general real rectangular matrices, and even stronger bounds are obtained for symmetric positive definite matrices. A simple example compares these new bounds with earlier ones.
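To make the setting concrete, the following is a minimal sketch (not the algorithm analyzed in the paper) of a rank-r skeleton approximation built by Gaussian elimination with complete pivoting, with the error measured entrywise in the Chebyshev norm, i.e. the largest absolute entry of the residual. The test matrix with polynomially decaying singular values and all names below are illustrative assumptions.

```python
import numpy as np

def skeleton_approximation(A, r):
    """Greedy cross/skeleton approximation of rank at most r (illustrative sketch)."""
    R = A.astype(float).copy()           # current residual
    approx = np.zeros_like(R)
    for _ in range(r):
        # complete pivoting: pick the entry of largest absolute value
        i, j = np.unravel_index(np.argmax(np.abs(R)), R.shape)
        if R[i, j] == 0.0:
            break                         # residual is exactly zero, stop early
        update = np.outer(R[:, j], R[i, :]) / R[i, j]   # rank-1 cross step
        approx += update
        R -= update
    return approx

rng = np.random.default_rng(0)
# Example matrix with polynomially decaying singular values, sigma_k ~ k^{-2}.
U, _ = np.linalg.qr(rng.standard_normal((200, 200)))
V, _ = np.linalg.qr(rng.standard_normal((150, 150)))
sigma = np.arange(1, 151, dtype=float) ** -2.0
A = U[:, :150] @ np.diag(sigma) @ V.T

for r in (5, 10, 20):
    A_r = skeleton_approximation(A, r)
    print(r, np.max(np.abs(A - A_r)))     # Chebyshev-norm error of the rank-r skeleton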