Gaussian mixture models are widely used to model data generated from multiple latent sources. Despite their popularity, most theoretical research assumes that the labels are either independent and identically distributed or follow a Markov chain. It remains unclear how the fundamental limits of estimation change under more complex dependence. In this paper, we address this question for the spherical two-component Gaussian mixture model. We first show that, for labels with arbitrary dependence, a naive estimator based on the misspecified likelihood is $\sqrt{n}$-consistent. Additionally, for labels that follow an Ising model, we establish the information-theoretic limits of estimation and discover an interesting phase transition as the dependence becomes stronger. When the dependence is below a threshold, the optimal estimator and its limiting variance exactly match those of the independent case for a wide class of Ising models. Under stronger dependence, by contrast, estimation becomes easier and the naive estimator is no longer optimal. We therefore propose an alternative estimator based on a variational approximation of the likelihood and argue its optimality under a specific Ising model.
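To make the "naive" estimator concrete, the following is a minimal illustrative sketch (not the paper's implementation): an EM fit of the misspecified i.i.d. likelihood for a one-dimensional symmetric two-component mixture $\tfrac12 N(\theta,1) + \tfrac12 N(-\theta,1)$, deliberately ignoring any dependence among the latent labels. All function names and parameter choices here are hypothetical.

```python
import numpy as np

def fit_naive_mle(x, n_iter=200):
    """EM for the misspecified i.i.d. likelihood of the symmetric mixture
    0.5*N(theta, 1) + 0.5*N(-theta, 1).
    Dependence among the latent labels is ignored ('naive' estimator)."""
    theta = np.mean(np.abs(x))  # crude positive initialization
    for _ in range(n_iter):
        # E-step: posterior probability of the +theta component,
        # which simplifies to a sigmoid in 2*theta*x under unit variance
        w = 1.0 / (1.0 + np.exp(-2.0 * theta * x))
        # M-step: weighted mean update, theta = mean((2w - 1) * x)
        theta = np.mean((2.0 * w - 1.0) * x)
    return theta

# Synthetic data with i.i.d. labels, purely for illustration
rng = np.random.default_rng(0)
n, theta_true = 5000, 1.5
labels = rng.choice([-1, 1], size=n)
x = theta_true * labels + rng.normal(size=n)  # unit-variance Gaussian noise
theta_hat = fit_naive_mle(x)
```

Even when the labels are dependent, the abstract's first result says this same procedure (which pretends they are i.i.d.) remains $\sqrt{n}$-consistent for $\theta$.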