The concept of linearity plays a central role in both mathematics and computer science, with distinct yet complementary meanings. In mathematics, linearity underpins functions and vector spaces, forming the foundation of linear algebra and functional analysis. In computer science, it relates to resource-sensitive computation. Linear Logic (LL), for instance, models assumptions that must be used exactly once, providing a natural framework for tracking computational resources such as time, memory, or data access. This dual perspective makes linearity essential to programming languages, type systems, and formal models that express both computational complexity and composability. Bridging these interpretations enables rigorous yet practical methodologies for analyzing and verifying complex systems.

This thesis explores the use of LL to model programming paradigms based on linearity. It comprises two parts: ADLL and CryptoBLL. The former applies LL to Automatic Differentiation (AD), modeling linear functions over the reals and the transposition operation. The latter uses LL to express complexity constraints on adversaries in computational cryptography.

In AD, two main approaches use linear type systems: a theoretical one grounded in proof theory, and a practical one implemented in JAX, a Python library developed by Google for machine learning research. In contrast, frameworks like PyTorch and TensorFlow support AD without linear types. ADLL aims to bridge theory and practice by connecting JAX's type system to LL.

In modern cryptography, several calculi aim to model cryptographic proofs within the computational paradigm. These efforts face a trade-off between expressiveness, to capture reductions, and simplicity, to abstract probability and complexity. CryptoBLL addresses this tension by proposing a framework for the automatic analysis of protocols in computational cryptography.
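The transposition operation mentioned above is exposed directly in JAX's public API as `jax.linear_transpose`, which takes a function that is linear in its argument and returns its transpose. A minimal sketch (the specific function `f` is an illustrative choice, not taken from the thesis):

```python
import jax

# A function that is linear over the reals: f(x) = 3.0 * x.
def f(x):
    return 3.0 * x

# jax.linear_transpose builds the transpose map f^T, satisfying
# <f(x), y> = <x, f^T(y)>; for scalar multiplication, f^T = f.
# The second argument (1.0) only fixes the input's shape and dtype.
f_t = jax.linear_transpose(f, 1.0)

# Apply the transpose to a cotangent value; the result is a tuple
# with one entry per primal input.
(y,) = f_t(2.0)  # mathematically, f^T(2.0) = 3.0 * 2.0 = 6.0
```

JAX enforces linearity here through its type discipline: applying `linear_transpose` to a non-linear function raises an error, which is precisely the behavior ADLL relates to LL's linear function types.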