It is challenging to guide neural network (NN) learning with prior knowledge. In contrast, many known properties, such as spatial smoothness or seasonality, are straightforward to model by choosing an appropriate kernel in a Gaussian process (GP). Many deep learning applications could be enhanced by modeling such known properties. For example, convolutional neural networks (CNNs) are frequently used in remote sensing, which is subject to strong seasonal effects. We propose to blend the strengths of deep learning and the clear modeling capabilities of GPs by using a composite kernel that combines a kernel implicitly defined by a neural network with a second kernel function chosen to model known properties (e.g., seasonality). We implement this idea by combining a deep network with an efficient mapping based on the Nyström approximation, which we call the Implicit Composite Kernel (ICK). We then adopt a sample-then-optimize approach to approximate the full GP posterior distribution. We demonstrate that ICK has superior performance and flexibility on both synthetic and real-world data sets. We believe the ICK framework can be used to incorporate prior information into neural networks in many applications.
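The efficient mapping mentioned above rests on the standard Nyström approximation, which turns a chosen kernel (here a squared-exponential kernel, as an illustrative stand-in for e.g. a seasonality kernel) into an explicit finite-dimensional feature map whose inner products approximate the kernel. The sketch below is a minimal illustration of that mapping only, not the full ICK model; the landmark points, lengthscale, and data are assumptions for the example.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def nystrom_features(X, landmarks, kernel):
    # Nystrom feature map: phi(x) = K_xm @ K_mm^{-1/2}, so that
    # phi(x) . phi(x') approximates k(x, x').
    K_mm = kernel(landmarks, landmarks)
    K_xm = kernel(X, landmarks)
    # Symmetric inverse square root of K_mm via eigendecomposition
    # (jitter added for numerical stability).
    w, V = np.linalg.eigh(K_mm + 1e-8 * np.eye(len(landmarks)))
    K_mm_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    return K_xm @ K_mm_inv_sqrt

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
landmarks = np.linspace(-3, 3, 30)[:, None]  # illustrative landmark grid

phi = nystrom_features(X, landmarks, rbf_kernel)

# Check that the feature-space inner products recover the kernel.
K_true = rbf_kernel(X, X)
K_approx = phi @ phi.T
err = np.max(np.abs(K_true - K_approx))
```

In the composite-kernel setting described in the abstract, such finite-dimensional features for the known-property kernel can then be combined with a neural network's learned representation, so that the whole model is trainable end to end while the chosen kernel contributes its known structure.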