This paper proposes a novel perspective on learning, positing it as the pursuit of dynamical invariants -- combinations of data that remain constant, or change minimally, as a system evolves. This concept is underpinned by both informational and physical principles rooted in the inherent properties of such invariants. First, their stability makes them ideal for memorization and integration into associative networks, forming the basis of our knowledge structures. Second, their predictability makes them valuable sources of usable energy, quantifiable as kT ln 2 per bit of accurately predicted information. This energy can be harnessed to explore new transformations, rendering learning systems energetically autonomous and increasingly effective; such systems are driven to continuously seek new data invariants as energy sources. The paper further explores several meta-architectures of autonomous, self-propelled learning agents that exploit predictable information patterns as a source of usable energy.
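The kT ln 2 figure is Landauer's bound, with k the Boltzmann constant and T the absolute temperature. As a minimal sketch of the quantity involved (the function names here are illustrative, not from the paper), the energy extractable per accurately predicted bit can be computed as:

```python
import math

# Boltzmann constant in joules per kelvin (CODATA 2018 value).
K_B = 1.380649e-23

def landauer_energy_per_bit(temperature_k: float) -> float:
    """Landauer bound: k_B * T * ln(2) joules per bit."""
    return K_B * temperature_k * math.log(2)

def predicted_information_energy(bits: float, temperature_k: float = 300.0) -> float:
    """Energy (joules) corresponding to `bits` of accurately predicted information."""
    return bits * landauer_energy_per_bit(temperature_k)
```

At room temperature (about 300 K) one bit corresponds to roughly 2.87e-21 J, which illustrates why large amounts of predicted information are needed before this becomes a practically significant energy source.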