Mixture of experts (MoE) models are widely applied to conditional probability density estimation problems. We demonstrate the richness of the class of MoE models by proving denseness results in Lebesgue spaces, when both the input and output variables are compactly supported. We further prove an almost uniform convergence result when the input is univariate. Auxiliary lemmas are proved regarding the richness of the soft-max gating function class and its relationship to the class of Gaussian gating functions.
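For concreteness, a minimal sketch of the model class under study; the specific parameterization below is our illustrative assumption, not one fixed by the statement above. A soft-max-gated MoE represents the conditional density of an output $\mathbf{y}$ given an input $\mathbf{x}$ as
\[
f(\mathbf{y}\mid\mathbf{x}) \;=\; \sum_{k=1}^{K} \frac{\exp\!\left(a_k + \mathbf{b}_k^{\top}\mathbf{x}\right)}{\sum_{j=1}^{K}\exp\!\left(a_j + \mathbf{b}_j^{\top}\mathbf{x}\right)}\, \phi\!\left(\mathbf{y};\boldsymbol{\mu}_k,\boldsymbol{\Sigma}_k\right),
\]
where the soft-max weights are the gating functions and $\phi(\cdot\,;\boldsymbol{\mu}_k,\boldsymbol{\Sigma}_k)$ denotes a Gaussian expert density. In this notation, a denseness result asserts that, as the number of components $K$ grows, mixtures of this form can approximate a target conditional density arbitrarily well in $L^p$ norm on the compact supports.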