We present a parsimonious surrogate framework for learning high-dimensional parametric maps from limited training data. The need for parametric surrogates arises in many applications that require repeated queries of complex computational models. These applications include such "outer-loop" problems as Bayesian inverse problems, optimal experimental design, and optimal design and control under uncertainty, as well as real-time inference and control problems. Many high-dimensional parametric mappings admit low-dimensional structure, which can be exploited via mapping-informed reduced bases of the inputs and outputs. Exploiting this property, we develop a framework for learning low-dimensional approximations of such maps by adaptively constructing ResNet approximations between reduced bases of their inputs and outputs. Motivated by recent approximation theory for ResNets as discretizations of control flows, we prove a universal approximation property of our proposed adaptive projected ResNet framework, which motivates a related iterative algorithm for the ResNet construction. This strategy represents a confluence of the approximation theory and the algorithm, since both make use of sequentially minimizing flows. In numerical examples we show that these parsimonious, mapping-informed architectures are able to achieve remarkably high accuracy given few training data, making them a desirable surrogate strategy when the computational budget for training data generation is minimal.
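The core architectural idea, projecting the input onto a reduced basis, propagating the reduced coordinates through a ResNet, and lifting the result with an output reduced basis, can be illustrated with a minimal NumPy sketch. Everything below is an assumption for illustration only: the dimensions, the random orthonormal matrices standing in for mapping-informed reduced bases, and the untrained residual-layer weights, which in the proposed framework would instead be constructed adaptively, layer by layer, from data.

    import numpy as np

    # Hypothetical sizes: full input/output dimensions and reduced-basis ranks.
    d_in, d_out = 4096, 2048     # full parametric input/output dimensions (assumed)
    r = 50                       # reduced-basis dimension / latent ResNet width (assumed)
    depth = 4                    # number of residual layers (assumed)

    rng = np.random.default_rng(0)

    # Stand-ins for mapping-informed reduced bases (e.g., computed from the map's
    # structure); here they are random orthonormal columns purely for illustration.
    V_in, _ = np.linalg.qr(rng.standard_normal((d_in, r)))
    V_out, _ = np.linalg.qr(rng.standard_normal((d_out, r)))

    # Residual-layer weights; random here, adaptively trained in the actual framework.
    Ws = [rng.standard_normal((r, r)) / np.sqrt(r) for _ in range(depth)]
    bs = [np.zeros(r) for _ in range(depth)]

    def projected_resnet(x):
        """Project input, apply residual (flow-like) updates, lift to output space."""
        z = V_in.T @ x                   # project input onto reduced input basis
        for W, b in zip(Ws, bs):
            z = z + np.tanh(W @ z + b)   # residual update, an Euler-like step of a flow
        return V_out @ z                 # lift latent state with reduced output basis

    y = projected_resnet(rng.standard_normal(d_in))
    print(y.shape)                       # (2048,)

The sketch only shows the forward map of a projected ResNet; the adaptive, layer-by-layer construction and the training on limited data are the subject of the paper itself.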