This paper presents an efficient variational inference framework for deriving a family of structured Gaussian process regression network (SGPRN) models. The key idea is to incorporate auxiliary inducing variables in the latent functions and to treat both the distributions of the inducing variables and the hyperparameters jointly as variational parameters. We then propose structured variational distributions and marginalize the latent variables, which yields a decomposable, tractable variational lower bound and enables stochastic optimization. Our inference approach can model data in which the outputs do not share a common input set, with a computational complexity independent of the size of the inputs and outputs, and thus easily handles datasets with missing values. We illustrate the performance of our method on synthetic data and real datasets, and show that our model generally provides better imputation of missing data than the state of the art. We also provide a visualization approach for the time-varying correlation across outputs in electrocorticography data, and these estimates provide insight into the neural population dynamics.
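To make the abstract's ingredients concrete, the following is a minimal sketch of the standard sparse variational GP building block it builds on: auxiliary inducing variables with a Gaussian variational posterior q(u) = N(m, S), latent function values marginalized in closed form, and an evidence lower bound that decomposes over data points and therefore supports stochastic (mini-batch) optimization. This is a generic Titsias/Hensman-style single-output sketch, not the paper's SGPRN (which additionally couples multiple latent functions through a regression network and treats hyperparameter distributions variationally); all names (rbf, elbo) are illustrative.

```python
# Sparse variational GP regression: inducing variables and a decomposable ELBO.
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def elbo(X, y, Z, m, L_S, sigma2, jitter=1e-6):
    """ELBO with q(u) = N(m, S), S = L_S L_S^T, inducing inputs Z.

    ELBO = sum_n E_q[log N(y_n | f_n, sigma2)] - KL(q(u) || p(u));
    the sum over the N observations is what permits mini-batching.
    """
    N, M = X.shape[0], Z.shape[0]
    Kuu = rbf(Z, Z) + jitter * np.eye(M)
    Kuf = rbf(Z, X)                        # M x N cross-covariance
    kff = np.ones(N)                       # diag of Kff for a unit-variance RBF
    Luu = np.linalg.cholesky(Kuu)
    A = np.linalg.solve(Luu, Kuf)          # Luu^{-1} Kuf
    # Moments of q(f_n) after marginalizing the inducing variables u.
    mean_f = A.T @ np.linalg.solve(Luu, m)
    B = np.linalg.solve(Luu, L_S)
    var_f = kff - (A**2).sum(0) + ((B.T @ A)**2).sum(0)
    # Expected Gaussian log-likelihood, one term per observation.
    ell = (-0.5 * np.log(2 * np.pi * sigma2)
           - 0.5 * (y - mean_f)**2 / sigma2
           - 0.5 * var_f / sigma2).sum()
    # KL(N(m, S) || N(0, Kuu)) in closed form.
    S = L_S @ L_S.T
    alpha = np.linalg.solve(Luu, m)
    kl = 0.5 * (np.trace(np.linalg.solve(Kuu, S)) + alpha @ alpha - M
                + 2 * np.log(np.diag(Luu)).sum()
                - 2 * np.log(np.diag(L_S)).sum())
    return ell - kl

# Toy usage: 5 inducing points summarizing 200 noisy observations of sin(x).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Z = np.linspace(-3, 3, 5)[:, None]
print(elbo(X, y, Z, m=np.zeros(5), L_S=0.1 * np.eye(5), sigma2=0.01))
```

In this sketch the variational parameters (m, L_S) and the hyperparameters (lengthscale, variance, sigma2) would be optimized jointly by gradient ascent on the bound; the SGPRN framework described above extends this idea to multiple correlated outputs that need not share a common input set.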