We design a low-complexity distributed compression scheme for computing arbitrary functions of sources with discrete alphabets. We use a helper-based method that extends the definition of the G{\'a}cs-K{\"o}rner-Witsenhausen (GKW) common information to functional common information. The helper relaxes the combinatorial structure of GKW by partitioning the joint source distribution into nests imposed by the function, which ensures hierarchical cooperation between the sources for effective distributed computing. By contrasting our approach's performance with existing efficient techniques, we demonstrate the rate savings achieved in recovering the function and the source data. Permutation-invariant functions are prevalent in learning and combinatorial optimization and have most recently been applied to graph neural networks. We consider efficient compression for computing permutation-invariant functions in a network with two sources and one decoder. We use a bipartite graph representation in which the two disjoint vertex sets (parts) denote the individual source graphs and the edge weights capture the joint source distribution. We compress the bipartite graph by forming connected components determined by the function's distribution, accounting for the edge symmetries, and eliminating the low-probability edges. Those eliminated edges are encoded separately and sent as refinements. Our approach can substitute for high-complexity joint decoding techniques and inform neural networks to reduce computing time and complexity.
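As a concrete illustration of the bipartite-graph step, the minimal Python sketch below builds a weighted bipartite graph from a toy joint source distribution, prunes low-probability edges, and reads off the connected components used as the coarse description, with the pruned edges collected for separate refinement encoding. The joint distribution, the threshold value, and all variable names here are hypothetical illustrations, not the paper's notation or parameters.

```python
import networkx as nx

# Toy joint distribution p(x, y) over two discrete source alphabets
# (hypothetical values chosen so the probabilities sum to 1).
joint = {
    ("x0", "y0"): 0.30, ("x0", "y1"): 0.02,
    ("x1", "y0"): 0.03, ("x1", "y1"): 0.35,
    ("x2", "y2"): 0.28, ("x2", "y0"): 0.02,
}

THRESHOLD = 0.05  # hypothetical cutoff: edges below it become refinements

# Bipartite graph: one part per source; edge weights carry p(x, y).
G = nx.Graph()
for (x, y), p in joint.items():
    G.add_node(("X", x), bipartite=0)
    G.add_node(("Y", y), bipartite=1)
    G.add_edge(("X", x), ("Y", y), weight=p)

# Split the edges: high-probability edges stay in the coarse description,
# low-probability edges are encoded separately and sent as refinements.
refinement_edges = [(u, v) for u, v, w in G.edges(data="weight") if w < THRESHOLD]
G.remove_edges_from(refinement_edges)

# Connected components of the pruned graph play the role of the common
# description: both sources can agree on a component index at low rate.
components = [sorted(c) for c in nx.connected_components(G)]
print("coarse components:", components)
print("refinement edges:", refinement_edges)
```

In this sketch, pruning the two weight-0.02 and the weight-0.03 edges disconnects the graph into small components, so a short component index replaces full source descriptions and the rare edges are deferred to the refinement layer.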