It was recently shown that lossless compression of a single source $X^n$ is achievable with a notion of strong locality: any $X_i$ can be decoded from a constant number of compressed bits, with a probability of error that vanishes in $n$. By contrast, we show that for two separately encoded sources $(X^n,Y^n)$, lossless compression with strong locality is generally not possible. Specifically, we show that for the class of ``confusable'' sources, strong locality cannot be achieved whenever one of the sources is compressed below its entropy. Irrespective of $n$, for some index $i$ the probability of error in decoding $(X_i,Y_i)$ is lower bounded by $2^{-O(d)}$, where $d$ denotes the number of compressed bits accessed by the local decoder. Conversely, if the sources are not confusable, strong locality is possible even if one of the sources is compressed below its entropy. The results extend to an arbitrary number of sources.