We study the problem of privately estimating the parameters of $d$-dimensional Gaussian Mixture Models (GMMs) with $k$ components. For this, we develop a technique to reduce the problem to its non-private counterpart. This allows us to privatize existing non-private algorithms in a black-box manner, while incurring only a small overhead in the sample complexity and running time. As the main application of our framework, we develop an $(\varepsilon, \delta)$-differentially private algorithm to learn GMMs using the non-private algorithm of Moitra and Valiant [MV10] as a black-box. Consequently, this gives the first sample complexity upper bound and the first polynomial-time algorithm for privately learning GMMs without any boundedness assumptions on the parameters.
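The abstract does not specify the details of the reduction, so the sketch below illustrates the general black-box privatization idea with a different, standard technique: subsample-and-aggregate, where a non-private estimator runs on disjoint chunks of the data and the resulting estimates are aggregated with calibrated Gaussian noise. The function name `privatize_blackbox` and all parameters are hypothetical; note that the clipping step imposes exactly the kind of boundedness assumption that the paper's actual reduction is claimed to avoid.

```python
import numpy as np

def privatize_blackbox(data, nonprivate_estimator, epsilon, delta,
                       num_chunks=10, clip_radius=1.0, rng=None):
    """Illustrative subsample-and-aggregate wrapper (not the paper's reduction).

    Changing one input record changes at most one chunk, hence at most one
    per-chunk estimate; after clipping, the L2 sensitivity of the mean is
    2 * clip_radius / num_chunks, so Gaussian noise calibrated to that
    sensitivity yields (epsilon, delta)-differential privacy.
    """
    rng = np.random.default_rng() if rng is None else rng
    chunks = np.array_split(data, num_chunks)
    estimates = np.stack([nonprivate_estimator(c) for c in chunks])

    # Clip each estimate into a ball of radius clip_radius. This is the
    # boundedness assumption the paper's black-box reduction avoids.
    norms = np.linalg.norm(estimates, axis=1, keepdims=True)
    estimates = estimates * np.minimum(1.0, clip_radius / np.maximum(norms, 1e-12))

    sensitivity = 2.0 * clip_radius / num_chunks
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return estimates.mean(axis=0) + rng.normal(0.0, sigma, size=estimates.shape[1])


# Usage: privatize a non-private mean estimator for a single Gaussian component.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(loc=[0.3, -0.2], scale=0.5, size=(10_000, 2))
    private_mean = privatize_blackbox(data, lambda c: c.mean(axis=0),
                                      epsilon=1.0, delta=1e-6, rng=rng)
    print(private_mean)
```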