Entropy and information can be considered dual: entropy is a measure of the subspace carved out of a given ambient space by the constraints the information imposes. Negative entropies, which arise in na\"ive extensions of the definition of entropy from discrete to continuous settings, are byproducts of the use of probabilities, which work in the discrete case only by a fortunate coincidence. We introduce notions such as sup-normalization and information measures, which allow for a generalization of the definition of entropy that preserves its interpretation as a subspace volume. Applying this in the context of topological groups and Haar measures, we elucidate the relationship between entropy, symmetry, and uniformity.
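The negative entropies mentioned above are easy to exhibit numerically. Below is a minimal sketch (the function names are ours, not the paper's): discrete Shannon entropy is always nonnegative, while the na\"ive continuous extension (differential entropy) of a uniform density on an interval of width $w$ evaluates to $\log w$, which is negative whenever $w < 1$.

```python
import numpy as np

def shannon_entropy(p):
    """Discrete Shannon entropy H = -sum p log p, in nats.

    For a uniform distribution over n outcomes this equals log n >= 0;
    discrete entropy is always nonnegative.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log 0 is taken as 0 by convention
    return -np.sum(p * np.log(p))

def differential_entropy_uniform(width):
    """Naive continuous extension for a uniform density on an
    interval of the given width:
        h = -integral of (1/w) log(1/w) dx = log w,
    which goes negative as soon as width < 1.
    """
    return np.log(width)

print(shannon_entropy([0.25] * 4))        # log 4, nonnegative
print(differential_entropy_uniform(0.5))  # log 0.5, negative
```

The sign flip is exactly the probability artifact the abstract describes: the discrete formula happens to coincide with a volume-like quantity, while its direct continuous analogue does not.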