Modern computer hardware supports low- and mixed-precision arithmetic for enhanced computational efficiency. In practical predictive modeling, however, it becomes vital to quantify the uncertainty due to rounding alongside other sources of uncertainty (such as measurement, sampling, and numerical discretization) to ensure that efficiency gains do not compromise accuracy. Higham and Mary [1] showed that modeling rounding errors as zero-mean independent random variables yields a problem-size-dependent constant, $\tilde{\gamma}_n \propto \sqrt{n}$, which grows more slowly than its counterpart in traditional deterministic analysis. We propose a novel variance-informed probabilistic rounding error analysis that models rounding errors as bounded, independent, and identically distributed (i.i.d.) random variables. This yields a new constant, $\hat{\gamma}_n$, which depends on the mean, variance, and bounds of the rounding error distribution. We rigorously show that $\hat{\gamma}_n \propto \sqrt{n}$ using the statistical properties of rounding errors, without the ad hoc assumptions required in Higham and Mary [1]. This new constant grows gradually with problem size and can improve rounding error estimates for large-scale arithmetic operations performed in low precision by up to six orders of magnitude. We conduct numerical experiments on random vector dot products, matrix-vector multiplication, the solution of a linear system, and a stochastic boundary value problem. We show that quantifying rounding uncertainty alongside the traditional sources (numerical discretization, sampling, parameters) enables a more efficient allocation of computational resources, balancing computational efficiency against predictive accuracy. This study is a step towards a comprehensive mixed-precision approach that improves model reliability and enables budgeting of computational resources in predictive modeling and decision-making.
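The abstract contrasts the classical deterministic constant, whose leading term grows like $nu$, with probabilistic constants that grow like $\sqrt{n}\,u$. Below is a minimal sketch of that comparison, not the paper's implementation: it measures the rounding error of a recursive half-precision dot product and prints it next to the first-order magnitudes $nu$ (deterministic) and $\lambda\sqrt{n}\,u$ (probabilistic). The factor $\lambda = 1$, the problem sizes, and the uniform test data are illustrative assumptions, and the $\sqrt{n}\,u$ column only stands in for the scaling; it is not the variance-informed constant $\hat{\gamma}_n$, which additionally uses the mean, variance, and bounds of the rounding error distribution.

```python
# Minimal sketch (assumptions noted below): measure the rounding error of a
# recursive fp16 dot product and compare it with the first-order deterministic
# magnitude n*u and a probabilistic magnitude lambda*sqrt(n)*u.
# The paper's variance-informed constant hat{gamma}_n is NOT implemented here;
# sqrt(n)*u only illustrates the scaling, and lam = 1 is an assumption.
import numpy as np

rng = np.random.default_rng(0)
u = 2.0 ** -11      # unit roundoff of IEEE half precision (fp16)
lam = 1.0           # illustrative probabilistic factor (assumption)

for n in (2 ** 8, 2 ** 11, 2 ** 14):
    a = rng.uniform(-1.0, 1.0, n)
    b = rng.uniform(-1.0, 1.0, n)

    ah = a.astype(np.float16)   # inputs rounded to fp16 once
    bh = b.astype(np.float16)

    # Recursive summation entirely in fp16: every product and every
    # addition is rounded to half precision.
    s = np.float16(0.0)
    for ai, bi in zip(ah, bh):
        s = np.float16(s + np.float16(ai * bi))

    a64, b64 = ah.astype(np.float64), bh.astype(np.float64)
    exact = np.dot(a64, b64)                                # float64 reference
    rel_err = abs(float(s) - exact) / np.dot(np.abs(a64), np.abs(b64))

    det = n * u                  # leading term of the deterministic gamma_n
    prob = lam * np.sqrt(n) * u  # sqrt(n) scaling of the probabilistic bounds

    print(f"n={n:6d}  measured={rel_err:.2e}  "
          f"n*u={det:.2e}  sqrt(n)*u={prob:.2e}")
```

The gap between the $nu$ and $\sqrt{n}\,u$ columns widens with $n$, which is the effect the abstract quantifies: at fp16 ($u = 2^{-11}$) the deterministic bound is already uninformative once $n > 2048$ (since $nu > 1$), whereas the $\sqrt{n}\,u$ term remains small for much larger problem sizes.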