In 1952, von Neumann gave a series of groundbreaking lectures in which he proved that circuits built from 3-input majority gates, each malfunctioning independently with a sufficiently small probability $\delta > 0$, can reliably compute Boolean functions. In 1999, Evans and Schulman used a strong data-processing inequality (SDPI) to establish the tightest known necessary condition, $\delta < \frac{1}{2} - \frac{1}{2\sqrt{k}}$, for reliable computation by circuits whose components each have at most $k$ inputs. In 2017, Polyanskiy and Wu distilled Evans and Schulman's SDPI argument into a general result on the contraction of mutual information in Bayesian networks. In this essay, we first introduce the problem of reliable computation from unreliable components and establish the existence of noise thresholds. We then give an exposition of von Neumann's result for 3-input majority gates and extend it to minority gates. Next, we introduce SDPIs, which have many applications, including in statistical mechanics, portfolio theory, and lower bounds for statistical estimation under privacy constraints. Finally, we use this material to present Polyanskiy and Wu's 2017 result on Bayesian networks, from which the 1999 Evans–Schulman bound follows.
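As a warm-up for the von Neumann exposition, the following minimal Python sketch (our own illustration, not von Neumann's construction; the helper name `restored_error` is hypothetical) iterates the one-step error recursion of a noisy 3-input majority gate. Assuming the three input copies are wrong independently with probability $\varepsilon$ (as happens in tree-structured formulas) and the gate itself flips its output with probability $\delta$, the restored signal is wrong with probability $\delta + (1-2\delta)\left(3\varepsilon^2(1-\varepsilon) + \varepsilon^3\right)$.

```python
def restored_error(eps: float, delta: float) -> float:
    """Error probability after one noisy 3-input majority restoration."""
    # The majority of three independent inputs is wrong when at least
    # two of them are wrong.
    maj_wrong = 3 * eps**2 * (1 - eps) + eps**3
    # The gate then flips its own output independently with probability delta.
    return delta * (1 - maj_wrong) + (1 - delta) * maj_wrong

for delta in (0.01, 0.05, 0.2):
    eps = 0.4  # start from a badly corrupted signal
    for _ in range(50):
        eps = restored_error(eps, delta)
    print(f"delta = {delta}: error settles near {eps:.4f}")
```

A short computation shows this recursion has a fixed point below $\frac{1}{2}$ exactly when $\delta < \frac{1}{6}$: below that noise level, repeated majority restoration pins the error probability near $\delta$, while above it the iterates drift to $\frac{1}{2}$ and the output is no better than a coin flip.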