Multilevel data are prevalent in many real-world applications. However, it remains an open research problem to identify and justify a class of models that flexibly captures a wide range of multilevel data structures. Motivated by the versatility of mixture of experts (MoE) models in fitting regression data, in this article we extend the MoE framework and study a class of mixed MoE (MMoE) models for multilevel data. Under some regularity conditions, we prove that the MMoE is dense, in the sense of weak convergence, in the space of continuous mixed effects models. As a result, the MMoE has the potential to accurately resemble almost all characteristics inherent in multilevel data, including marginal distributions, dependence structures, regression links, random intercepts, and random slopes. In the particular case where the multilevel data are hierarchical, we further show that a nested version of the MMoE universally approximates a broad range of dependence structures for the random effects among different factor levels.
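To make the model class concrete, here is a minimal sketch of one plausible MMoE specification, assuming a softmax gating network driven by both the covariates and a cluster-level random effect; the symbols $K$, $\pi_k$, $g_k$, $\boldsymbol{\alpha}_k$ and $\boldsymbol{\gamma}_k$ are illustrative and not necessarily the paper's exact notation:

\[
  f(y \mid \mathbf{x}, \mathbf{u}) \;=\; \sum_{k=1}^{K} \pi_k(\mathbf{x}, \mathbf{u})\, g_k(y \mid \mathbf{x}),
  \qquad
  \pi_k(\mathbf{x}, \mathbf{u}) \;=\; \frac{\exp\!\big(\boldsymbol{\alpha}_k^{\top}\mathbf{x} + \boldsymbol{\gamma}_k^{\top}\mathbf{u}\big)}{\sum_{k'=1}^{K} \exp\!\big(\boldsymbol{\alpha}_{k'}^{\top}\mathbf{x} + \boldsymbol{\gamma}_{k'}^{\top}\mathbf{u}\big)},
\]

where the expert densities $g_k$ belong to a tractable parametric family and the random effect $\mathbf{u}$, shared by observations within the same cluster, is integrated out to obtain the marginal model. Read this way, the denseness result states that mixtures of this form can weakly approximate any continuous mixed effects model.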