Latent space models play an important role in the modeling and analysis of network data. Under these models, each node has an associated latent point in some (typically low-dimensional) geometric space, and network formation is driven by this unobserved geometric structure. The random dot product graph (RDPG) and its generalization, the generalized random dot product graph (GRDPG), are latent space models under which this latent geometry is taken to be Euclidean. These latent vectors can be efficiently and accurately estimated using well-studied spectral embeddings. In this paper, we develop a minimax lower bound for estimating the latent positions in the RDPG and the GRDPG models under the two-to-infinity norm, and show that a particular spectral embedding method achieves this lower bound. We also derive a minimax lower bound for the related task of subspace estimation under the two-to-infinity norm that holds in general for low-rank plus noise network models, of which the RDPG and GRDPG are special cases. The lower bounds are achieved by a novel construction based on Hadamard matrices.
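To make the estimation task concrete, the following is a minimal sketch (in Python with NumPy; it is not code from the paper) that simulates an RDPG, computes an adjacency spectral embedding of the observed network, and evaluates the two-to-infinity error, i.e., the largest row-wise Euclidean error, after an orthogonal alignment. The sample size, embedding dimension, and latent distribution are illustrative assumptions, and the alignment step reflects the fact that RDPG latent positions are identified only up to an orthogonal transformation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- simulate an RDPG (illustrative parameters, not taken from the paper) ---
n, d = 500, 2
X = rng.uniform(0.2, 0.6, size=(n, d))        # latent positions; entries of X @ X.T lie in (0, 1)
P = X @ X.T                                   # edge-probability matrix
A = rng.binomial(1, np.triu(P, 1))            # Bernoulli draws on the upper triangle
A = A + A.T                                   # symmetric, hollow adjacency matrix

# --- adjacency spectral embedding: scaled top-d eigenvectors of A ---
evals, evecs = np.linalg.eigh(A)
idx = np.argsort(np.abs(evals))[::-1][:d]     # d eigenvalues largest in magnitude
X_hat = evecs[:, idx] * np.sqrt(np.abs(evals[idx]))

# --- align to the truth (positions are identified only up to rotation) ---
U, _, Vt = np.linalg.svd(X_hat.T @ X)
W = U @ Vt                                    # orthogonal Procrustes solution

# --- two-to-infinity error: worst per-node (row-wise) Euclidean error ---
err = np.max(np.linalg.norm(X_hat @ W - X, axis=1))
print(f"two-to-infinity estimation error: {err:.3f}")
```

Because the two-to-infinity norm is the maximum row norm, it controls the estimation error of the worst-estimated node rather than an average over nodes, which is what makes it a natural metric for per-node guarantees in the minimax bounds described above.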