Gaussian processes are a key component of many flexible statistical and machine learning models. However, they exhibit cubic computational complexity and high memory requirements due to the need to invert and store a full covariance matrix. To circumvent this, mixtures of Gaussian process experts have been considered, in which data points are assigned to independent experts, reducing the complexity by allowing inference based on smaller, local covariance matrices. Moreover, mixtures of Gaussian process experts substantially enrich the model's flexibility, allowing for behaviors such as non-stationarity, heteroscedasticity, and discontinuities. In this work, we construct a novel inference approach based on nested sequential Monte Carlo samplers to simultaneously infer both the gating network and the Gaussian process expert parameters. This greatly improves inference compared to importance sampling, particularly in settings where a stationary Gaussian process is inappropriate, while still being highly parallelizable.
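To make the computational argument concrete, the following is a minimal, self-contained sketch (not the paper's method) of how hard-assigning data points to independent Gaussian process experts replaces one large covariance factorization with several smaller, local ones. The squared-exponential kernel, the noise level, and the simple threshold-based gating rule are illustrative assumptions only.

```python
# Illustrative sketch: independent GP experts cut the cubic cost because each
# expert factorizes only its own local covariance matrix.
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_log_marginal(x, y, noise=0.1):
    """Log marginal likelihood of a zero-mean GP; O(n^3) in len(x)."""
    K = rbf_kernel(x, x) + noise ** 2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * len(x) * np.log(2 * np.pi))

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 300))
# Non-stationary data: smooth on the left, noisy and flat on the right.
y = np.where(x < 5, np.sin(x), 3 + 0.1 * rng.normal(size=x.size))

# Hypothetical hard gating: assign each point to one of two experts by input location.
assignments = (x >= 5).astype(int)

# Full GP: one 300x300 covariance; mixture: two roughly 150x150 covariances.
full = gp_log_marginal(x, y)
experts = sum(gp_log_marginal(x[assignments == k], y[assignments == k]) for k in (0, 1))
print(f"full GP log-lik: {full:.1f}, sum of expert log-liks: {experts:.1f}")
```

Because each expert's Cholesky factorization costs O(n_k^3) for its own n_k points, the sum over experts is much cheaper than the single O(n^3) factorization of the full covariance, which is the complexity reduction the abstract refers to; the gating rule here is fixed by hand, whereas the paper infers it jointly with the expert parameters.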