Inspired by the analysis of variance (ANOVA) decomposition of functions, we propose a Gaussian-uniform mixture model on the high-dimensional torus, which relies on the assumption that the function we wish to approximate is well explained by low-order variable interactions. We consider three approaches, namely wrapped Gaussians, diagonal wrapped Gaussians, and products of von Mises distributions. The sparsity of the mixture model is ensured by the fact that its summands are products of Gaussian-like density functions acting on low-dimensional spaces and uniform probability densities defined on the remaining directions. To learn such a sparse mixture model from given samples, we propose an objective function consisting of the negative log-likelihood of the mixture model and a regularizer that penalizes the number of its summands. To minimize this functional, we combine the expectation-maximization (EM) algorithm with a proximal step that takes the regularizer into account. To decide which summands of the mixture model are important, we apply a Kolmogorov-Smirnov test. Numerical examples demonstrate the performance of our approach.
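To illustrate the kind of summand the abstract describes, the following is a minimal sketch (not the authors' implementation) of a diagonal wrapped-Gaussian density on a few "active" coordinates of the torus, combined with the uniform density 1/(2π) on the remaining directions. The function names, the truncation of the wrapping sum, and the `active` index set are illustrative assumptions.

```python
import numpy as np

def wrapped_gaussian_pdf(x, mu, sigma, n_wrap=10):
    """1D wrapped Gaussian density on [0, 2*pi), approximated by
    truncating the wrapping sum over 2*pi*Z at +/- n_wrap periods."""
    k = np.arange(-n_wrap, n_wrap + 1)
    # sum of shifted Gaussian densities; broadcasting adds the lattice axis
    vals = np.exp(-0.5 * ((x[..., None] - mu + 2 * np.pi * k) / sigma) ** 2)
    return vals.sum(axis=-1) / (sigma * np.sqrt(2 * np.pi))

def sparse_summand_pdf(x, active, mu, sigma):
    """One mixture summand on the d-torus: a diagonal wrapped Gaussian
    on the coordinates in `active`, uniform (1/(2*pi) per direction)
    on the remaining d - len(active) coordinates."""
    d = x.shape[-1]
    dens = (1.0 / (2 * np.pi)) ** (d - len(active))  # uniform part
    for j, m, s in zip(active, mu, sigma):
        dens = dens * wrapped_gaussian_pdf(x[..., j], m, s)
    return dens
```

Because each summand only involves low-dimensional Gaussian-like factors, evaluating it scales with the size of the active set rather than with the ambient dimension d, which is what makes the mixture sparse in the ANOVA sense.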