Using tools from local entropy theory, this paper establishes the following main results concerning mean dimensions and rate-distortion functions: $(1)$ We prove that for non-ergodic measures associated with almost sure processes, the mean R\'enyi information dimension coincides with the information dimension rate, answering a question posed by Gutman and \'Spiewak (Around the variational principle for metric mean dimension, \emph{Studia Math.} \textbf{261} (2021), 345--360). $(2)$ We introduce four types of rate-distortion entropies and establish their relations to the Kolmogorov-Sinai entropy. $(3)$ We show that for systems with the marker property and finite mean dimension, the supremum in Lindenstrauss-Tsukamoto's double variational principle can be taken over the set of ergodic measures. In addition, the double variational principle holds for various other measure-theoretic $\epsilon$-entropies.
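For orientation, the double variational principle referred to in $(3)$ may be recalled schematically as follows; this is only a sketch, suppressing the distinction between upper and lower rate-distortion dimensions and using standard notation not fixed in this abstract ($\mathscr{D}(\mathcal{X})$ for the compatible metrics, $M^{T}(\mathcal{X})$ for the $T$-invariant Borel probability measures, and $R_{\mu}(d,\epsilon)$ for the rate-distortion function):
\[
\mathrm{mdim}(\mathcal{X},T)\;=\;\min_{d\in\mathscr{D}(\mathcal{X})}\;\sup_{\mu\in M^{T}(\mathcal{X})}\;\limsup_{\epsilon\to 0}\frac{R_{\mu}(d,\epsilon)}{\log(1/\epsilon)}.
\]
In this language, result $(3)$ asserts that, for systems with the marker property and finite mean dimension, the inner supremum may be restricted to the ergodic measures.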