Entropy and information can be considered dual: entropy measures the subspace carved out of a given ambient space by the information constraining it. Negative entropies, which arise in na\"ive extensions of the definition of entropy from discrete to continuous settings, are byproducts of the use of probabilities, which work in the discrete case only by a fortunate coincidence. We introduce the notions of sup-normalization and information measures, which allow for a generalization of the definition of entropy that preserves its interpretation as a subspace volume. Applying this framework to topological groups and Haar measures, we elucidate the relationship between entropy, symmetry, and uniformity.
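As a standard illustration of the negative-entropy issue mentioned above (an example supplied here for context, not taken from the abstract), consider the uniform distribution on an interval of length $a$:

```latex
% Discrete Shannon entropy is always nonnegative:
%   H(p) = -\sum_i p_i \log p_i \ge 0,  since 0 \le p_i \le 1.
%
% The naive continuous analogue (differential entropy) is
%   h(f) = -\int f(x) \log f(x)\, dx.
%
% For the uniform density f(x) = 1/a on [0, a]:
%   h(f) = -\int_0^a \frac{1}{a} \log\frac{1}{a}\, dx = \log a,
% which is negative whenever a < 1 (e.g. a = 1/2 gives h = -\log 2).
```

The sign failure traces to the density $f$ exceeding $1$, which cannot happen for discrete probabilities; this is the "fortunate coincidence" of the discrete case.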