We generalise Tyler's (1981) inference procedure for eigenvectors of symmetrizable matrices to invariant and singular subspaces of non-diagonalizable matrices. Wald tests for invariant vectors and $t$-tests for their individual coefficients perform well in simulations, even though the matrix is not symmetric. Using these results, it is now possible to perform inference on network statistics that depend on eigenvectors of the non-symmetric adjacency matrices that arise in empirical applications with directed networks. Further, we find that statisticians need only control the first-order Davis-Kahan bound in order to control the convergence rates of invariant subspace estimators to higher orders. For general invariant subspaces, the minimal eigenvalue separation dominates the first-order bound, potentially slowing convergence rates considerably. In an example, we find that accounting for uncertainty in network estimates changes empirical conclusions about the ranking of nodes' popularity.
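For orientation only (an illustrative sketch, not a result taken from the paper, with notation $A$, $\widehat{A}$, $U$, $\widehat{U}$, $\delta$ chosen here for exposition): in the classical symmetric setting, a first-order Davis-Kahan bound states, up to constants and the precise definition of the spectral gap, that
\[
  \bigl\| \sin \Theta\bigl(\widehat{U}, U\bigr) \bigr\|
  \;\lesssim\;
  \frac{\bigl\| \widehat{A} - A \bigr\|}{\delta},
\]
where $A$ is the population matrix, $\widehat{A}$ its estimate, $U$ and $\widehat{U}$ the corresponding invariant subspaces, and $\delta$ the separation between the eigenvalues associated with $U$ and the remainder of the spectrum. The claim above is that for non-symmetric matrices and general invariant subspaces this separation $\delta$ can dominate the first-order bound and thereby slow convergence.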