We study the two-party communication complexity of functions with large outputs, and show that the communication complexity can vary greatly depending on which output model is considered. We study a variety of output models, ranging from the open model, in which an external observer can compute the outcome, to the XOR model, in which the outcome of the protocol should be the bitwise XOR of the players' local outputs. The latter model is inspired by XOR games, which are widely studied two-player quantum games. We focus on the question of error reduction in these new output models. For functions of output size k, applying standard error reduction techniques in the XOR model would introduce an additional cost linear in k. We show that no dependency on k is necessary. Similarly, standard randomness removal techniques incur a multiplicative cost of $2^k$ in the XOR model. We show how to reduce this factor to O(k). In addition, we prove analogous error reduction and randomness removal results in the other models, separate all models from each other, and show that some natural problems, including Set Intersection and Find the First Difference, separate the models when the Hamming weights of their inputs are bounded. Finally, we show how to use the rank lower bound technique for our weak output models.