The bootstrap is a widely used procedure for statistical inference because of its simplicity and attractive statistical properties. However, the vanilla bootstrap is no longer computationally feasible for many modern massive datasets, since it requires repeatedly resampling the entire dataset. Several improvements to the bootstrap have therefore been proposed in recent years; these assess the quality of estimators by drawing subsamples from the full dataset and then resampling within each subsample. Naturally, the performance of these modern subsampling methods depends on tuning parameters such as the subsample size, the number of subsamples, and the number of resamples per subsample. In this paper, we develop a novel hyperparameter selection methodology for choosing these tuning parameters. Formulated as an optimization problem that maximizes a measure of estimator accuracy subject to a computational cost constraint, our framework provides closed-form solutions for the optimal hyperparameter values of the subsampled bootstrap, the subsampled double bootstrap, and the bag of little bootstraps, at little or no extra time cost. Using the mean squared error as a proxy for the accuracy measure, we apply our methodology to study, compare, and improve the performance of these modern variants of the bootstrap for massive data through a simulation study. The results are promising.
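To make the subsample-then-resample structure and its three tuning parameters concrete, here is a minimal sketch of the bag of little bootstraps, assuming the statistic of interest is the sample mean and the accuracy measure is its standard error; the function name blb_mean_se and the default hyperparameter values are illustrative only and are not taken from the paper.

```python
import numpy as np

def blb_mean_se(data, s=10, r=100, gamma=0.7, seed=None):
    """Bag of little bootstraps estimate of the standard error of the mean.

    Hyperparameters (the tuning parameters discussed above):
      s     -- number of subsamples
      r     -- number of bootstrap resamples per subsample
      gamma -- subsample size exponent, b = n**gamma
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    b = int(n ** gamma)                                # subsample size
    subsample_ses = []
    for _ in range(s):
        # Draw a subsample of size b from the full dataset without replacement.
        sub = rng.choice(data, size=b, replace=False)
        stats = []
        for _ in range(r):
            # Resample n points from the b-point subsample via multinomial
            # counts, so each resample carries the weight of the full dataset
            # while only the small subsample is held in memory.
            counts = rng.multinomial(n, np.full(b, 1.0 / b))
            stats.append(np.dot(counts, sub) / n)      # weighted resample mean
        subsample_ses.append(np.std(stats, ddof=1))    # SE estimate on this subsample
    # Average the per-subsample standard-error estimates.
    return float(np.mean(subsample_ses))
```

The hyperparameter selection problem the paper addresses is visible here: larger s, r, or gamma improves the accuracy of the returned estimate but increases the computational cost, and the paper's framework chooses these values by optimizing that trade-off in closed form.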