Bootstrapping has long been a primary tool for uncertainty quantification, and its theoretical and computational properties have been investigated extensively in statistics and machine learning. However, because the bootstrap is inherently repetitive, the computational burden of bootstrapping neural networks is prohibitively heavy, which seriously hinders its practical use for uncertainty estimation in modern deep learning. To overcome this computational bottleneck, we propose Neural Bootstrapper (NeuBoots), a procedure that constructs a generator of bootstrapped networks. Unlike the standard bootstrap, NeuBoots is trained with a single loss function in a single training run. It thus avoids the repeated training inherent in the standard bootstrap, significantly improving the efficiency of the bootstrap computation. We show theoretically that NeuBoots asymptotically approximates the standard bootstrap distribution, and our empirical examples support this assertion. We then apply NeuBoots to uncertainty quantification tasks in machine learning, including prediction calibration, semantic segmentation, out-of-distribution detection, and active learning. Our empirical results show that NeuBoots outperforms state-of-the-art procedures for uncertainty quantification at a much lower computational cost.
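To make the single-training idea concrete, the following is a minimal PyTorch sketch of one way a bootstrap-conditioned generator could be trained with a single weighted loss: random bootstrap weights over data blocks are drawn at each step, fed to the network as an auxiliary input, and resampled at test time to obtain bootstrap predictions without retraining. The names (`NeuBootsNet`, `sample_alpha`), the block-multinomial weight distribution, and the concatenation-based conditioning are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuBootsNet(nn.Module):
    """Hypothetical sketch: condition a small MLP on a bootstrap-weight
    vector by concatenating it to every input example."""
    def __init__(self, in_dim, n_groups, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim + n_groups, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, alpha):
        a = alpha.expand(x.size(0), -1)  # share one weight vector across the batch
        return self.net(torch.cat([x, a], dim=-1))

def sample_alpha(n_groups):
    # Multinomial resampling counts over data blocks (mean 1 per block),
    # mimicking the nonparametric bootstrap's random weights.
    m = torch.distributions.Multinomial(
        total_count=n_groups, probs=torch.full((n_groups,), 1.0 / n_groups))
    return m.sample()  # shape (n_groups,), nonnegative, sums to n_groups

# --- one training run with a single weighted loss (toy data) ---
n_groups, in_dim = 10, 5
model = NeuBootsNet(in_dim, n_groups)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(128, in_dim)
y = x.sum(dim=1, keepdim=True) + 0.1 * torch.randn(128, 1)
group = torch.randint(0, n_groups, (128,))  # block assignment per example

for _ in range(200):
    alpha = sample_alpha(n_groups)          # fresh bootstrap weights each step
    w = alpha[group]                        # weight of each example's block
    per_example = F.mse_loss(model(x, alpha), y, reduction="none").squeeze(-1)
    loss = (w * per_example).mean()         # single alpha-weighted loss
    opt.zero_grad()
    loss.backward()
    opt.step()

# --- bootstrap distribution from the trained generator, no retraining ---
x_new = torch.randn(1, in_dim)
preds = torch.stack([model(x_new, sample_alpha(n_groups)) for _ in range(100)])
lo, hi = preds.quantile(0.025), preds.quantile(0.975)  # 95% bootstrap interval
```

Under these assumptions, each draw of `alpha` plays the role of one bootstrap replicate, so sampling many weight vectors at test time replaces retraining the network many times.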