This work considers the low-rank approximation of a matrix $A(t)$ depending on a parameter $t$ in a compact set $D \subset \mathbb{R}^d$. Application areas that give rise to such problems include computational statistics and dynamical systems. Randomized algorithms are an increasingly popular approach for performing low-rank approximation, and they usually proceed by multiplying the matrix with random dimension reduction matrices (DRMs). Applying such algorithms directly to $A(t)$ would involve different, independent DRMs for every $t$, which is not only expensive but also leads to inherently non-smooth approximations. In this work, we propose to use constant DRMs, that is, $A(t)$ is multiplied with the same DRM for every $t$. The resulting parameter-dependent extensions of two popular randomized algorithms, the randomized singular value decomposition and the generalized Nystr\"{o}m method, are computationally attractive, especially when $A(t)$ admits an affine linear decomposition with respect to $t$. We perform a probabilistic analysis for both algorithms, deriving bounds on the expected value as well as failure probabilities for the $L^2$ approximation error when using Gaussian random DRMs. Both the theoretical results and the numerical experiments show that the use of constant DRMs does not impair their effectiveness; our methods reliably return quasi-best low-rank approximations.
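The core idea of the constant-DRM randomized SVD can be sketched in a few lines: draw a single Gaussian DRM $\Omega$ once and reuse it for every parameter value $t$, so that the low-rank factors inherit the smoothness of $A(t)$. The following is a minimal NumPy sketch under assumed conventions (the function name `rand_svd_constant_drm`, the callable interface `A_of_t`, and the choice of sketch size are illustrative, not taken from the paper):

```python
import numpy as np

def rand_svd_constant_drm(A_of_t, ts, sketch_size, rng=None):
    """Sketch of a parameter-dependent randomized SVD with a constant
    Gaussian dimension reduction matrix (DRM).

    A_of_t      : callable mapping a parameter value t to an (m, n) ndarray A(t)
    ts          : sequence of parameter values
    sketch_size : target rank plus oversampling
    Returns a dict mapping t to factors (Q, B) with A(t) ~ Q @ B.
    """
    rng = np.random.default_rng(rng)
    n = A_of_t(ts[0]).shape[1]
    # Draw ONE Gaussian DRM and reuse it for every t; independent draws
    # per t would be costlier and would break smoothness in t.
    Omega = rng.standard_normal((n, sketch_size))
    approx = {}
    for t in ts:
        A = A_of_t(t)
        # Range finder: orthonormal basis for the sketched range A(t) Omega.
        Q, _ = np.linalg.qr(A @ Omega)
        approx[t] = (Q, Q.T @ A)  # A(t) ~ Q (Q^T A(t))
    return approx
```

When $A(t)$ has an affine linear decomposition $A(t) = \sum_i \theta_i(t) A_i$, the products $A_i \Omega$ can be precomputed once, which is the source of the computational savings mentioned above.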