We study the problem of distinguishing between two symmetric probability distributions over $n$ bits by observing $k$ bits of a sample, subject to the constraint that all $(k-1)$-wise marginal distributions of the two distributions are identical. Previous works of Bogdanov et al. and of Huang and Viola have established approximately tight results on the maximal statistical distance when $k$ is at most a small constant fraction of $n$, and Naor and Shamir gave a tight bound for all $k$ in the case of distinguishing with the OR function. In this work we provide sharp upper and lower bounds on the maximal statistical distance that hold for all $k$. Upper bounds on the statistical distance have typically been obtained by providing uniform low-degree polynomial approximations to certain higher-degree polynomials; the sharpness and wider applicability of our result stem from the construction of suitable non-uniform approximations.
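To make the setup concrete, the following Python sketch (not part of the paper, purely illustrative) computes the total variation distance between the $k$-bit marginals of two symmetric distributions on $\{0,1\}^n$, each specified by its probability profile over Hamming weights. As a hypothetical example it uses the uniform distributions over even-weight and odd-weight strings, whose $(n-1)$-wise marginals coincide even though the distributions are disjoint at $k=n$.

```python
from math import comb

def k_marginal(weight_probs, n, k):
    """Marginal of a symmetric distribution on {0,1}^n over any fixed k coordinates.

    weight_probs[w] is the total probability mass on strings of Hamming weight w.
    Returns a dict mapping j (weight of the k observed bits) to the probability
    of one fixed k-bit pattern of weight j (all such patterns are equiprobable
    by symmetry).
    """
    marg = {}
    for j in range(k + 1):
        p = 0.0
        for w, pw in enumerate(weight_probs):
            if 0 <= w - j <= n - k:
                # chance that a uniformly random weight-w string shows the
                # chosen weight-j pattern on the observed k coordinates
                p += pw * comb(n - k, w - j) / comb(n, w)
        marg[j] = p
    return marg

def statistical_distance(P, Q, n, k):
    """Total variation distance between the k-bit marginals of two symmetric
    distributions given by their weight profiles P and Q."""
    mp, mq = k_marginal(P, n, k), k_marginal(Q, n, k)
    return 0.5 * sum(comb(k, j) * abs(mp[j] - mq[j]) for j in range(k + 1))

if __name__ == "__main__":
    n = 6
    # Uniform over even-weight strings vs. uniform over odd-weight strings:
    # every (n-1)-wise marginal is uniform, yet the full distributions are disjoint.
    even = [comb(n, w) / 2 ** (n - 1) if w % 2 == 0 else 0.0 for w in range(n + 1)]
    odd  = [comb(n, w) / 2 ** (n - 1) if w % 2 == 1 else 0.0 for w in range(n + 1)]
    for k in range(1, n + 1):
        print(f"k = {k}: distance = {statistical_distance(even, odd, n, k):.6f}")
```

Running this prints distance $0$ for every $k \le n-1$ and distance $1$ at $k = n$, illustrating how the distinguishing power can jump once $k$ exceeds the order up to which the marginals are forced to agree.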