This work introduces the Efficient Transformed Gaussian Process (ETGP), a new way of creating C stochastic processes characterized by: 1) the C processes are non-stationary, 2) the C processes are dependent by construction without needing a mixing matrix, 3) training and making predictions are very efficient, since the number of Gaussian Process (GP) operations (e.g., inverting the inducing points' covariance matrix) does not depend on the number of processes. This makes the ETGP particularly suited for multi-class problems with a very large number of classes, which are the problems studied in this work. ETGPs exploit the recently proposed Transformed Gaussian Process (TGP), a stochastic process specified by transforming a Gaussian Process using an invertible transformation. However, unlike TGPs, ETGPs are constructed by transforming a single sample from a GP using C invertible transformations. We derive an efficient sparse variational inference algorithm for the proposed model and demonstrate its utility on 5 classification tasks that include small, medium, and large datasets and different numbers of classes, ranging from just a few to hundreds. Our results show that ETGPs generally outperform state-of-the-art methods for multi-class classification based on GPs, at a lower computational cost (around one order of magnitude smaller).
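To illustrate the construction summarized above, the following is a minimal sketch (not the authors' code): a single GP sample is drawn once and then passed through C invertible transformations, one per class, so the C resulting processes are dependent by construction and the GP operations do not grow with C. The RBF kernel, the per-class affine transformations, and the softmax likelihood used here are illustrative assumptions; the paper's actual transformations and sparse variational inference are more involved.

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel over rows of X (an assumed choice of kernel).
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

rng = np.random.default_rng(0)
N, D, C = 50, 2, 4                       # points, input dimension, number of classes
X = rng.normal(size=(N, D))

# One shared GP sample f: a single set of GP operations, independent of C.
K = rbf_kernel(X) + 1e-6 * np.eye(N)
f = np.linalg.cholesky(K) @ rng.normal(size=N)

# C invertible (here: affine) transformations with hypothetical per-class parameters.
a = rng.normal(size=C)                   # per-class log-scales
b = rng.normal(size=C)                   # per-class shifts
F = np.exp(a)[None, :] * f[:, None] + b[None, :]   # shape (N, C); processes dependent by construction

# Softmax over the C transformed processes yields multi-class probabilities.
P = np.exp(F - F.max(axis=1, keepdims=True))
P /= P.sum(axis=1, keepdims=True)
print(P.shape)  # (N, C)
```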