G\'acs' coarse-grained algorithmic entropy leverages universal computation to quantify the information content of any given state. Unlike the Boltzmann and Shannon-Gibbs entropies, it requires no prior commitment to a partition of phase space or to probabilistic ensembles. Whereas earlier work drew only loose connections between the entropy of thermodynamic systems and that of information-processing systems, the algorithmic entropy formally unifies the two. For a close variant of G\'acs' definition, we prove a very general second law of thermodynamics and discuss its advantages over previous formulations. Our law is a general property of Markov processes, which can themselves be derived as coarse-grainings of certain time-reversible dynamical systems. Finally, taking inspiration from Maxwell's demon, we model an information engine powered by compressible data.
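As a rough illustration of the last point, and not the paper's actual construction, the sketch below uses an ordinary compressor as a computable stand-in for (uncomputable) algorithmic information content and applies the standard Landauer bound of k_B T ln 2 per bit: a tape of n bits whose algorithmic content is roughly k bits lets a demon extract on the order of k_B T ln 2 (n - k) of work. The choice of compressor (zlib), the temperature, and the example tapes are assumptions made purely for illustration.

```python
import os
import zlib
from math import log

# Boltzmann constant (J/K) and an assumed bath temperature (K) for the Landauer bound.
K_B = 1.380649e-23
T = 300.0

def compressed_bits(data: bytes) -> int:
    """Proxy (upper bound) for algorithmic information content, in bits,
    using a real compressor in place of Kolmogorov complexity."""
    return 8 * len(zlib.compress(data, 9))

def extractable_work_bound(data: bytes) -> float:
    """Landauer-style estimate: a tape of n bits with roughly k bits of
    algorithmic content yields at most about k_B * T * ln 2 * (n - k) of work."""
    n_bits = 8 * len(data)
    # Clamp: the compressor's overhead can exceed n for incompressible tapes.
    k_bits = min(compressed_bits(data), n_bits)
    return K_B * T * log(2) * (n_bits - k_bits)

# A highly compressible tape versus an incompressible-looking one.
ordered_tape = b"\x00" * 10_000
random_tape = os.urandom(10_000)
print(extractable_work_bound(ordered_tape))  # close to k_B * T * ln 2 * 80000
print(extractable_work_bound(random_tape))   # near zero
```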