We calculate and analyze various entropy measures and their properties for selected probability distributions. The entropies considered include the Shannon, R\'enyi, generalized R\'enyi, Tsallis, Sharma-Mittal, and modified Shannon entropies, along with the Kullback-Leibler divergence. These measures are examined for several distributions, including the gamma, chi-squared, exponential, Laplace, and log-normal distributions. We investigate how each entropy depends on the parameters of the respective distribution. We also study the convergence of Shannon entropy for certain probability distributions. Furthermore, we identify the extreme values of Shannon entropy for Gaussian vectors.
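For reference, a minimal block with the standard textbook definitions of the main measures listed above, for a probability density $f$; the paper's own conventions, in particular for the generalized R\'enyi and modified Shannon entropies, may differ:
\begin{align*}
H(f) &= -\int f(x)\,\ln f(x)\,dx, \\
H_\alpha(f) &= \frac{1}{1-\alpha}\,\ln\!\int f(x)^\alpha\,dx, && \alpha>0,\ \alpha\neq 1, \\
T_q(f) &= \frac{1}{q-1}\left(1-\int f(x)^q\,dx\right), && q>0,\ q\neq 1, \\
H_{\alpha,\beta}(f) &= \frac{1}{1-\beta}\left[\left(\int f(x)^\alpha\,dx\right)^{\frac{1-\beta}{1-\alpha}}-1\right], \\
D(f\,\|\,g) &= \int f(x)\,\ln\frac{f(x)}{g(x)}\,dx.
\end{align*}
As a simple illustration of the parameter dependence studied here: for the exponential density $f(x)=\lambda e^{-\lambda x}$, $x>0$, the Shannon entropy evaluates to $H(f)=1-\ln\lambda$.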