Shannon entropy is widely used to quantify the uncertainty of discrete random variables. When normalized to the unit interval, however, as is often done in practice, it no longer conveys the alphabet size of the random variable under study: every uniform distribution attains the maximum value of one, whatever the number of symbols. This work introduces an entropy functional based on the Jensen-Shannon divergence that is naturally bounded from above by one. Unlike normalized Shannon entropy, the new functional is strictly increasing in alphabet size under uniformity and is thus well suited to the characterization of discrete random variables.
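To make the normalization issue concrete, here is a short worked calculation (illustrative only; it does not reproduce the paper's functional). For the uniform distribution $U_n$ on an alphabet of $n$ symbols, normalized Shannon entropy is identically one:
\[
\frac{H(U_n)}{\log n} \;=\; \frac{-\sum_{i=1}^{n} \tfrac{1}{n} \log \tfrac{1}{n}}{\log n} \;=\; \frac{\log n}{\log n} \;=\; 1 \qquad \text{for every } n \ge 2,
\]
so its value cannot distinguish a fair coin from a fair die. By contrast, the Jensen-Shannon divergence with base-2 logarithms satisfies $0 \le \mathrm{JSD}(P, Q) \le 1$ for any pair of distributions $P$, $Q$, which is why a functional built on it can be unit-bounded without dividing by $\log n$.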