We study the inversion analog of the well-known Gauss algorithm for multiplying complex matrices. A simple version is $(A + iB)^{-1} = (A + BA^{-1}B)^{-1} - i A^{-1}B(A+BA^{-1}B)^{-1}$ when $A$ is invertible, a formula that may be traced back to Frobenius but has received scant attention. We prove that it is optimal, requiring the fewest matrix multiplications and inversions over the base field, and we extend it in three ways: (i) to any invertible $A + iB$ without requiring $A$ or $B$ to be invertible; (ii) to any iterated quadratic extension field, with $\mathbb{C}$ over $\mathbb{R}$ as a special case; (iii) to Hermitian positive definite matrices $A + iB$ by exploiting the symmetric positive definiteness of $A$ and $A + BA^{-1}B$. We call all such algorithms Frobenius inversions, which, as we will see, do not follow from Sherman--Morrison--Woodbury-type identities and cannot be extended to the Moore--Penrose pseudoinverse. We show that a complex matrix with well-conditioned real and imaginary parts can be arbitrarily ill-conditioned, a situation tailor-made for Frobenius inversion. We prove that Frobenius inversion for complex matrices is faster than standard inversion by LU decomposition, and that Frobenius inversion for Hermitian positive definite matrices is faster than standard inversion by Cholesky decomposition. We provide extensive numerical experiments, applying Frobenius inversion to solve linear systems, evaluate the matrix sign function, solve the Sylvester equation, and compute the polar decomposition, showing that Frobenius inversion can be more efficient than LU/Cholesky decomposition with negligible loss in accuracy. A side result is a generalization of Gauss multiplication to iterated quadratic extensions, which we show is intimately related to the Karatsuba algorithm for fast integer multiplication and the multidimensional fast Fourier transform.
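As a quick illustration of the simple version of the formula stated above, the following NumPy sketch inverts $A + iB$ using only real-matrix solves, assuming $A$ is invertible. The function name, test setup, and choice of library routines are illustrative and not taken from the paper.

```python
import numpy as np

def frobenius_inverse(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Invert A + iB via real arithmetic, assuming A is invertible.

    Implements (A + iB)^{-1} = C - i A^{-1} B C with C = (A + B A^{-1} B)^{-1}.
    Illustrative sketch only; not the paper's reference implementation.
    """
    AinvB = np.linalg.solve(A, B)       # A^{-1} B
    C = np.linalg.inv(A + B @ AinvB)    # (A + B A^{-1} B)^{-1}
    return C - 1j * (AinvB @ C)         # real part C, imaginary part -A^{-1} B C

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    A = rng.standard_normal((n, n))     # hypothetical test matrices
    B = rng.standard_normal((n, n))
    M = A + 1j * B
    residual = np.linalg.norm(frobenius_inverse(A, B) @ M - np.eye(n))
    print(f"residual: {residual:.2e}")
```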