Laws of large numbers establish asymptotic guarantees for recovering features of a probability distribution from independent samples. We introduce a framework for proving analogous results for recovery of the $\sigma$-field of a probability space, interpreted as information resolution: the granularity of measurable events that can be resolved by comparison with our samples. Our main results show that, under iid sampling, the Borel $\sigma$-field in $\mathbb R^d$, and in more general metric spaces, can be recovered in the strongest possible mode of convergence. We also derive finite-sample $L^1$ bounds for uniform convergence of $\sigma$-fields on $[0,1]^d$. We illustrate the use of our framework with two applications: constructing randomized solutions to the Skorokhod embedding problem, and analyzing the loss of variants of random forests for regression.