G\'acs' coarse-grained algorithmic entropy leverages universal computation to quantify the information content of any given physical state. Unlike the Boltzmann and Shannon-Gibbs entropies, it requires no prior commitment to macrovariables or probabilistic ensembles. Whereas earlier work made only loose connections between the entropy of thermodynamic systems and that of information-processing systems, the algorithmic entropy formally unifies the two. After adapting G\'acs' definition to Markov processes, we prove a very general second law of thermodynamics and discuss its advantages over previous formulations. Finally, taking inspiration from Maxwell's demon, we model an information engine powered by compressible data.
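As an illustrative aside, and not the paper's formal construction, the link between compressibility and algorithmic information content can be sketched in a few lines of Python: the length of a losslessly compressed encoding is a computable upper bound on a string's algorithmic information content, up to machine-dependent constants. The `zlib`-based estimator and the example states below are assumptions chosen purely for illustration.

```python
import os
import zlib

def compressed_length_bits(state: bytes) -> int:
    """Upper-bound the information content of `state` (in bits) by the length
    of its zlib-compressed encoding; any lossless compressor yields such a
    bound, up to an additive constant depending on the chosen machine."""
    return 8 * len(zlib.compress(state, 9))

# A highly regular (compressible) state versus a typical (incompressible) one.
ordered = b"01" * 4096        # short description exists: low algorithmic entropy
scrambled = os.urandom(8192)  # no short description expected: high algorithmic entropy

print(compressed_length_bits(ordered))    # far below the raw 8 * 8192 bits
print(compressed_length_bits(scrambled))  # close to the raw 8 * 8192 bits
```

The gap between the two estimates is the kind of resource an information engine "powered by compressible data" can exploit, in the hedged sense of this toy illustration.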