Modern learning systems act on internal representations of data, yet how these representations encode underlying physical or statistical structure is often left implicit. In physics, geometric conservation laws of Hamiltonian systems, such as symplecticity, guarantee long-term stability, and recent work has begun to hard-wire such constraints into learning models at the loss or output level. Here we ask a different question: what would it mean for the representation itself to obey a symplectic conservation law in the sense of Hamiltonian mechanics? We express this symplectic constraint through Legendre duality: the pairing between primal and dual parameters becomes the structure that the representation must preserve. We formalize Legendre dynamics as stochastic processes whose trajectories remain on Legendre graphs, so that the evolving primal-dual parameters stay Legendre dual. We show that this class includes linear time-invariant Gaussian process regression and Ornstein-Uhlenbeck dynamics. Geometrically, we prove that the maps that preserve all Legendre graphs are exactly the symplectomorphisms of cotangent bundles of the form "cotangent lift of a base diffeomorphism followed by an exact fibre translation". Dynamically, this characterization leads to the design of a Symplectic Reservoir (SR), a reservoir-computing architecture that is a special case of a recurrent neural network and whose recurrent core is generated by Hamiltonian systems that are at most linear in the momentum. Our main theorem shows that every SR update has this normal form and therefore transports Legendre graphs to Legendre graphs, preserving Legendre duality at each time step. Overall, SR implements a geometrically constrained, Legendre-preserving representation map, injecting symplectic geometry and Hamiltonian mechanics directly at the representational level.
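To make the geometric characterization concrete, the following is a minimal sketch in standard symplectic-geometry notation; the symbols $F$, $\varphi$, $S$, $X$, $V$ are introduced here for illustration and are not taken from the paper. A Legendre graph is the graph of the differential of a potential $F$ on the base $Q$,
\[
\Lambda_F \;=\; \{(q, p) \in T^*Q \;:\; p = dF(q)\},
\]
and the normal form named above is a cotangent lift composed with an exact fibre translation,
\[
\Psi(q, p) \;=\; \bigl(\varphi(q),\; (D\varphi(q))^{-\top} p + dS(\varphi(q))\bigr),
\]
for a base diffeomorphism $\varphi: Q \to Q$ and a smooth $S: Q \to \mathbb{R}$. A direct computation gives $\Psi(\Lambda_F) = \Lambda_G$ with $G = F \circ \varphi^{-1} + S$, so Legendre graphs are carried to Legendre graphs. Hamiltonians at most linear in the momentum, $H(q, p) = \langle p, X(q) \rangle + V(q)$, generate flows of exactly this type: the flow of the vector field $X$ on the base supplies $\varphi$, and integrating $V$ along that flow supplies the exact fibre translation.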
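As a complementary numerical sketch, the snippet below implements one normal-form update with a linear base map and a quadratic potential, then checks that a quadratic Legendre graph is carried to another Legendre graph (a symmetric linear momentum field). All names and the linear/quadratic choices are illustrative assumptions, not the paper's Symplectic Reservoir implementation.

import numpy as np

rng = np.random.default_rng(0)
n = 3

# Base diffeomorphism phi(q) = A q with A invertible; its Jacobian is A.
A = rng.normal(size=(n, n)) + 3.0 * np.eye(n)

# Exact fibre translation by dS, with S(q) = 0.5 q^T C q, so dS(q) = C q.
C0 = rng.normal(size=(n, n))
C = 0.5 * (C0 + C0.T)

def sr_update(q, p):
    """One normal-form step: cotangent lift of phi, then exact fibre translation."""
    q_new = A @ q                          # phi(q)
    p_lift = np.linalg.solve(A.T, p)       # (D phi(q))^{-T} p
    return q_new, p_lift + C @ q_new       # translate by dS at the new base point

# A Legendre graph: p = dF(q) with F(q) = 0.5 q^T B q, B symmetric.
B0 = rng.normal(size=(n, n))
B = B0 @ B0.T + np.eye(n)

# Push sample points of the graph through the update.
qs = rng.normal(size=(100, n))
images = np.array([np.concatenate(sr_update(q, B @ q)) for q in qs])
Q_img, P_img = images[:, :n], images[:, n:]

# The image should satisfy p' = M q' with M symmetric, i.e. M = dG for a
# quadratic G, confirming the update maps Legendre graphs to Legendre graphs.
M, *_ = np.linalg.lstsq(Q_img, P_img, rcond=None)
print("image momentum field is a symmetric gradient:", np.allclose(M, M.T))

In this quadratic setting the check is exact: on the graph $p = Bq$, the update sends $(q, Bq)$ to $(q', (A^{-\top} B A^{-1} + C)\, q')$, and the matrix $A^{-\top} B A^{-1} + C$ is symmetric whenever $B$ and $C$ are.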