This paper studies the sample complexity of learning the $k$ unknown centers of a balanced Gaussian mixture model (GMM) in $\mathbb{R}^d$ with spherical covariance matrix $\sigma^2\mathbf{I}$. In particular, we are interested in the following question: what is the maximal noise level $\sigma^2$, for which the sample complexity is essentially the same as when estimating the centers from labeled measurements? To that end, we restrict attention to a Bayesian formulation of the problem, where the centers are uniformly distributed on the sphere $\sqrt{d}\mathcal{S}^{d-1}$. Our main results characterize the exact noise threshold $\sigma^2$ below which the GMM learning problem, in the large system limit $d,k\to\infty$, is as easy as learning from labeled observations, and above which it is substantially harder. The threshold occurs at $\frac{\log k}{d} = \frac12\log\left( 1+\frac{1}{\sigma^2} \right)$, which is the capacity of the additive white Gaussian noise (AWGN) channel. Thinking of the set of $k$ centers as a code, this noise threshold can be interpreted as the largest noise level for which the error probability of the code over the AWGN channel is small. Previous works on the GMM learning problem have identified the minimum distance between the centers as a key parameter in determining the statistical difficulty of learning the corresponding GMM. While our results are only proved for GMMs whose centers are uniformly distributed over the sphere, they hint that perhaps it is the decoding error probability associated with the center constellation as a channel code that determines the statistical difficulty of learning the corresponding GMM, rather than just the minimum distance.
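The stated threshold $\frac{\log k}{d} = \frac12\log\left(1+\frac{1}{\sigma^2}\right)$ can be solved for the critical noise level, giving $\sigma_*^2 = 1/(k^{2/d}-1)$. The following sketch (function name and example values of $k$ and $d$ are illustrative, not from the paper) computes this critical noise level and verifies it satisfies the capacity equation:

```python
import math

def critical_noise_level(k: int, d: int) -> float:
    """Noise level sigma^2 at which log(k)/d equals the AWGN capacity
    (1/2) * log(1 + 1/sigma^2); below this sigma^2, learning the GMM
    centers is (per the paper) as easy as the labeled problem.

    Derivation: log(k)/d = (1/2) log(1 + 1/sigma^2)
             => k^(2/d) = 1 + 1/sigma^2
             => sigma^2 = 1 / (k^(2/d) - 1).
    """
    return 1.0 / (k ** (2.0 / d) - 1.0)

# Illustrative example: k = 1024 centers in dimension d = 128.
sigma2 = critical_noise_level(1024, 128)

# Check the defining relation log(k)/d = (1/2) log(1 + 1/sigma^2).
lhs = math.log(1024) / 128
rhs = 0.5 * math.log(1.0 + 1.0 / sigma2)
```

Noise levels $\sigma^2 < \sigma_*^2$ fall below the threshold (the "easy" regime), and $\sigma^2 > \sigma_*^2$ fall above it.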