Based on the analysis of variance (ANOVA) decomposition of functions, which relies on the assumption that the function we wish to approximate can be well explained by limited variable interactions, we propose a sparse Gaussian-like mixture model on the high-dimensional torus. We consider three approaches, namely wrapped Gaussians, diagonal wrapped Gaussians, and products of von Mises distributions. The sparsity of the mixture model is ensured by the fact that its summands are products of Gaussian-like density functions acting on low-dimensional spaces and uniform probability densities defined on the remaining directions. To learn such a sparse mixture model from given samples, we propose an objective function consisting of the negative log-likelihood function of the mixture model and a regularizer that penalizes the number of its summands. For minimizing this functional, we combine the Expectation Maximization algorithm with a proximal step that takes the regularizer into account. To decide which summands of the mixture model are important, we apply a Kolmogorov-Smirnov test. Numerical examples demonstrate the performance of our approach.
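As a minimal illustration of the sparse summand structure described above, the following sketch evaluates one mixture component on the d-dimensional torus as a product of von Mises factors on a small set of active coordinates and the uniform density 1/(2π) on every remaining direction. This is an assumption-laden sketch, not the paper's implementation: the function names (`von_mises_pdf`, `sparse_summand_pdf`, `mixture_pdf`) and the parametrization by an index list `active` are illustrative choices.

```python
import numpy as np

def von_mises_pdf(x, mu, kappa):
    # Density of the von Mises distribution on [0, 2*pi);
    # np.i0 is the modified Bessel function of order zero.
    return np.exp(kappa * np.cos(x - mu)) / (2 * np.pi * np.i0(kappa))

def sparse_summand_pdf(x, active, mu, kappa):
    """One sparse mixture summand on the d-dimensional torus:
    a product of von Mises factors on the 'active' coordinates
    times the uniform density 1/(2*pi) on each inactive direction."""
    x = np.asarray(x)
    d = x.shape[-1]
    dens = np.ones(x.shape[:-1])
    for j, m, k in zip(active, mu, kappa):
        dens = dens * von_mises_pdf(x[..., j], m, k)
    # uniform factors on the d - len(active) remaining directions
    return dens * (1.0 / (2 * np.pi)) ** (d - len(active))

def mixture_pdf(x, weights, summands):
    # weights: mixture weights summing to one;
    # summands: list of (active, mu, kappa) triples.
    return sum(w * sparse_summand_pdf(x, *s)
               for w, s in zip(weights, summands))
```

With concentration kappa = 0, each von Mises factor reduces to the uniform density, so a summand with empty `active` list is simply the uniform density on the whole torus; the EM-with-proximal-step learning procedure and the Kolmogorov-Smirnov importance test from the abstract are not sketched here.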