Deep learning is moving towards increasingly sophisticated optimization objectives that employ higher-order functions, such as integration, continuous optimization, and root-finding. Since differentiable programming frameworks such as PyTorch and TensorFlow do not have first-class representations of these functions, developers must reason about the semantics of such objectives and manually translate them to differentiable code. We present a differentiable programming language, $\lambda_S$, that is the first to deliver a semantics for higher-order functions, higher-order derivatives, and Lipschitz but nondifferentiable functions. Together, these features enable $\lambda_S$ to expose differentiable, higher-order functions for integration, optimization, and root-finding as first-class functions with automatically computed derivatives. $\lambda_S$'s semantics is computable, meaning that values can be computed to arbitrary precision, and we implement $\lambda_S$ as an embedded language in Haskell. We use $\lambda_S$ to construct novel differentiable libraries for representing probability distributions, implicit surfaces, and generalized parametric surfaces -- all as instances of higher-order datatypes -- and present case studies that rely on computing the derivatives of these higher-order functions and datatypes. In addition to modeling existing differentiable algorithms, such as a differentiable ray tracer for implicit surfaces, without requiring any user-level differentiation code, we demonstrate new differentiable algorithms, such as the Hausdorff distance of generalized parametric surfaces.
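To make the idea of first-class, differentiable higher-order functions concrete, here is a minimal Haskell sketch. It is not $\lambda_S$'s actual API: the names `Dual`, `integral01`, and `deriv` are hypothetical, and it uses plain forward-mode dual numbers with a midpoint Riemann sum rather than $\lambda_S$'s computable real semantics. It only illustrates the shape of the interface, where integration is an ordinary higher-order function whose derivative with respect to a parameter is computed automatically (differentiation under the integral sign).

```haskell
-- A minimal sketch, NOT lambda_S's API: forward-mode dual numbers make a
-- Riemann-sum integrator differentiable in its parameters.
data Dual = Dual { primal :: Double, tangent :: Double }

instance Num Dual where
  Dual a a' + Dual b b' = Dual (a + b) (a' + b')
  Dual a a' * Dual b b' = Dual (a * b) (a' * b + a * b')
  negate (Dual a a')    = Dual (negate a) (negate a')
  abs    (Dual a a')    = Dual (abs a) (a' * signum a)
  signum (Dual a _)     = Dual (signum a) 0
  fromInteger n         = Dual (fromInteger n) 0

instance Fractional Dual where
  recip (Dual a a') = Dual (recip a) (negate a' / (a * a))
  fromRational r    = Dual (fromRational r) 0

-- Integrate f over [0,1] with a midpoint rule; because the sum is built
-- from Dual arithmetic, derivatives w.r.t. parameters flow through it.
integral01 :: (Dual -> Dual) -> Dual
integral01 f = sum [ f (Dual ((fromIntegral i + 0.5) / n) 0) / Dual n 0
                   | i <- [0 .. steps - 1] ]
  where steps = 1000 :: Int
        n     = fromIntegral steps

-- Derivative of a scalar-to-scalar function at c, by seeding tangent 1.
deriv :: (Dual -> Dual) -> Double -> Double
deriv g c = tangent (g (Dual c 1))

-- d/dc of int_0^1 (c*x)^2 dx = d/dc (c^2/3) = 2c/3; at c = 3, about 2.
main :: IO ()
main = print (deriv (\c -> integral01 (\x -> (c * x) * (c * x))) 3)
```

In $\lambda_S$ itself, the analogous operators are exact rather than a fixed-resolution quadrature: its computable semantics lets results such as this derivative be evaluated to arbitrary precision.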