We design a distributed function-aware quantization scheme for distributed functional compression. We consider $2$ correlated sources $X_1$ and $X_2$ and a destination that seeks an estimate $\hat{f}$ of the outcome of a continuous function $f(X_1,\,X_2)$. We develop a compression scheme called hyper binning that quantizes $f$ by minimizing the entropy of the joint source partitioning. Hyper binning is a natural generalization of Cover's random code construction for the asymptotically optimal Slepian-Wolf encoding scheme, which makes use of orthogonal binning. The key idea behind this approach is to use linear discriminant analysis to characterize different source feature combinations. This scheme captures the correlation between the sources and the structure of the function as a means of dimensionality reduction. We investigate the performance of hyper binning for different source distributions and identify which classes of sources require finer partitioning to achieve better function approximation. Our approach brings an information-theoretic perspective to the traditional vector quantization technique from signal processing.