Bootstrapping has long been a primary tool for ensembling and uncertainty quantification in machine learning and statistics. However, because it requires repeated resampling and retraining, bootstrapping deep neural networks is computationally burdensome, which limits its practical use in uncertainty estimation and related tasks. To overcome this computational bottleneck, we propose a novel approach called \emph{Neural Bootstrapper} (NeuBoots), which learns to generate bootstrapped neural networks through a single model training. NeuBoots injects the bootstrap weights into the high-level feature layers of the backbone network and outputs bootstrapped predictions of the target, without additional parameters or repeated training from scratch. We apply NeuBoots to various machine learning tasks related to uncertainty quantification, including prediction calibration in image classification and semantic segmentation, active learning, and detection of out-of-distribution samples. Our empirical results show that NeuBoots outperforms other bagging-based methods at a much lower computational cost, without losing the validity of bootstrapping.
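The core idea — conditioning one network on a bootstrap-weight vector so that a single trained model can emit many bootstrapped predictions — can be illustrated with a minimal sketch. This is not the authors' implementation: the network, the injection point (simple concatenation with the features), and the weight distribution (normalized multinomial counts, mimicking a classical bootstrap resample) are simplified illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, h, k = 32, 4, 8, 3  # samples, input dim, hidden dim, classes

# Backbone: one hidden layer stands in for the high-level feature extractor.
W1, b1 = rng.normal(size=(d, h)), np.zeros(h)
# Head consumes [features, bootstrap weights] -> class logits.
W2, b2 = rng.normal(size=(h + n, k)), np.zeros(k)

X = rng.normal(size=(n, d))

def neuboots_predict(X, alpha):
    """One forward pass conditioned on a bootstrap-weight vector alpha."""
    feats = np.maximum(X @ W1 + b1, 0.0)               # backbone features
    a = np.broadcast_to(alpha, (X.shape[0], alpha.size))
    logits = np.concatenate([feats, a], axis=1) @ W2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)            # softmax probabilities

def sample_alpha():
    # Multinomial counts over the n training points: one draw corresponds
    # to one classical bootstrap resample. During training these weights
    # would also reweight the per-sample loss; omitted here.
    return rng.multinomial(n, np.full(n, 1.0 / n)).astype(float)

# A single (untrained here) network yields many bootstrapped predictions:
preds = np.stack([neuboots_predict(X, sample_alpha()) for _ in range(10)])
mean, std = preds.mean(axis=0), preds.std(axis=0)      # predictive uncertainty
```

The contrast with classical bagging is that the ensemble dimension (here, 10 draws) costs only extra forward passes, not extra training runs.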