We prove that the sum of $t$ Boolean-valued random variables sampled by a random walk on a regular expander converges in total variation distance to a discrete normal distribution at a rate of $O(\lambda/t^{1/2-o(1)})$, where $\lambda$ is the second largest eigenvalue of the random walk matrix in absolute value. To the best of our knowledge, among known Berry-Esseen bounds for Markov chains, our result is the first to show convergence in total variation distance, and is also the first to incorporate a linear dependence on the expansion $\lambda$. In contrast, prior Markov chain Berry-Esseen bounds showed a convergence rate of $O(1/\sqrt{t})$ in weaker metrics such as Kolmogorov distance. Our result also improves upon prior work in the pseudorandomness literature, which showed that the total variation distance is $O(\lambda)$ when the approximating distribution is taken to be a binomial distribution. We achieve the faster $O(\lambda/t^{1/2-o(1)})$ convergence rate by generalizing the binomial distribution to discrete normals of arbitrary variance. Specifically, we construct discrete normals using a random walk on an appropriate 2-state Markov chain. Our bound can therefore be viewed as a regularity lemma that reduces the study of arbitrary expanders to a small class of particularly simple expanders.
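As an illustration only (not the paper's construction or proof), the following sketch simulates the basic setup on a hypothetical 2-state chain: it sums $t$ Boolean labels along a random walk on a symmetric 2-state Markov chain whose second eigenvalue is $\lambda$, and estimates the empirical total variation distance of the sum to a Binomial$(t,1/2)$ reference, the approximating distribution used in the prior pseudorandomness work mentioned above. The chain, parameters, and function names are assumptions made for the example.

```python
import numpy as np
from scipy.stats import binom

def sample_sums(lam, t, n_samples, rng):
    """Sum of t {0,1}-labels along a walk on the symmetric 2-state chain with
    transition matrix [[(1+lam)/2, (1-lam)/2], [(1-lam)/2, (1+lam)/2]],
    whose second eigenvalue is lam and whose stationary distribution is uniform."""
    stay = (1.0 + lam) / 2.0                     # probability of staying in the current state
    sums = np.zeros(n_samples, dtype=np.int64)
    for i in range(n_samples):
        state = rng.integers(2)                  # start from the stationary (uniform) distribution
        total = 0
        for _ in range(t):
            total += state                       # state 0 is labeled 0, state 1 is labeled 1
            if rng.random() > stay:              # switch states with probability (1 - lam) / 2
                state ^= 1
        sums[i] = total
    return sums

def tv_distance(samples, pmf):
    """Empirical total variation distance between the sample histogram and a pmf on {0, ..., t}."""
    counts = np.bincount(samples, minlength=len(pmf)) / len(samples)
    return 0.5 * np.abs(counts - pmf).sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lam, t = 0.3, 200                            # illustrative values, not taken from the paper
    sums = sample_sums(lam, t, n_samples=200_000, rng=rng)
    reference = binom.pmf(np.arange(t + 1), t, 0.5)
    print("empirical TV to Binomial(t, 1/2):", tv_distance(sums, reference))
```

The abstract's point is that replacing the binomial reference with a suitably chosen discrete normal (itself realized by a walk on a 2-state chain) improves the $O(\lambda)$ guarantee to $O(\lambda/t^{1/2-o(1)})$; the sketch only sets up the quantities being compared.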