Research in machine learning has polarized into two general approaches to regression: Transductive methods derive estimates directly from available data but are usually problem-agnostic. Inductive methods can be much more problem-specific, but generally require tuning and compute-intensive searches for solutions. In this work, we adopt a hybrid approach: we leverage the theory of Reproducing Kernel Banach Spaces (RKBS) and show that transductive principles can be induced through gradient descent to form efficient \textit{in-context} neural approximators. We apply this approach to RKBS of function-valued operators and show that, once trained, our \textit{Transducer} model can capture on-the-fly relationships between infinite-dimensional input and output functions, given only a few example pairs, and return new function estimates. We demonstrate the benefit of our transductive approach for modeling complex physical systems influenced by varying external factors with little data, at a fraction of the usual deep learning training computation cost, on partial differential equation and climate modeling applications.
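Schematically, and with illustrative notation, the transductive estimates we consider take a kernel-regression form:
\[
\mathcal{T}_\theta\big(\{(v_i, u_i)\}_{i=1}^{n}\big)(v) \;=\; \sum_{i=1}^{n} \kappa_\theta(v, v_i)\,(u_i),
\]
where $(v_i, u_i)_{i \le n}$ are the example input/output function pairs, $v$ is a new input function, and $\kappa_\theta$ is an operator-valued kernel of the RKBS whose parameters $\theta$ are learned by gradient descent.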