We discuss two approaches to solving the parametric (or stochastic) eigenvalue problem. One uses a Taylor expansion and the other a Chebyshev expansion. The parametric eigenvalue problem assumes that the matrix $A$ depends on a parameter $\mu$, where $\mu$ might be a random variable. Consequently, the eigenvalues and eigenvectors are also functions of $\mu$. We compute a Taylor approximation of these functions about $\mu_{0}$ by iteratively computing the Taylor coefficients. The complexity of this approach is $O(n^{3})$ for all eigenpairs, provided the derivatives of $A(\mu)$ at $\mu_{0}$ are given. The Chebyshev expansion works similarly: we first find an initial approximation iteratively, which we then refine with Newton's method. This second method is more expensive but provides a good approximation over the whole interval of the expansion instead of around a single point. We present numerical experiments confirming the complexity and demonstrating that the approaches are capable of tracking eigenvalues at intersection points. Further experiments shed light on the limitations of the Taylor expansion approach with respect to the distance from the expansion point $\mu_{0}$.
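To make the idea of expanding eigenvalues about $\mu_{0}$ concrete, the following minimal Python sketch computes only the first Taylor coefficient of each eigenvalue using standard first-order perturbation theory, $\lambda_k'(\mu_0) = v_k^{T} A'(\mu_0) v_k$ for a symmetric $A(\mu_0)$ with simple eigenvalues and normalized eigenvectors. It is an illustration under these assumptions, not the iterative higher-order scheme described above; the function name `taylor_eigs_first_order` and the example matrices are hypothetical.

```python
import numpy as np

def taylor_eigs_first_order(A0, dA0):
    """First-order Taylor coefficients of the eigenvalues of A(mu) at mu0.

    Assumes A0 = A(mu0) is symmetric with simple eigenvalues and
    dA0 = A'(mu0). For a normalized eigenpair (lam_k, v_k) of A0,
    first-order perturbation theory gives lam_k'(mu0) = v_k^T dA0 v_k.
    """
    lam, V = np.linalg.eigh(A0)                  # eigenpairs of A(mu0)
    dlam = np.einsum('ik,ij,jk->k', V, dA0, V)   # v_k^T dA0 v_k for each k
    return lam, dlam, V

# Hypothetical example: A(mu) = A0 + mu*B, expanded about mu0 = 0.
rng = np.random.default_rng(0)
A0 = rng.standard_normal((5, 5)); A0 = (A0 + A0.T) / 2
B  = rng.standard_normal((5, 5)); B  = (B + B.T) / 2

lam, dlam, _ = taylor_eigs_first_order(A0, B)
mu = 1e-3
approx = lam + mu * dlam                         # first-order Taylor approximation
exact  = np.linalg.eigvalsh(A0 + mu * B)
print(np.max(np.abs(np.sort(approx) - exact)))   # error is O(mu^2)
```

Near eigenvalue intersections the simple-eigenvalue assumption breaks down, which is exactly the regime the tracking experiments mentioned above address.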