Configurable software systems are employed in many important application domains. Understanding the performance of these systems under all configurations is critical to prevent potential performance issues caused by misconfiguration. However, as the number of configurations can be prohibitively large, it is not feasible to measure the system performance under all configurations. Thus, a common approach is to build a prediction model from limited measurement data and use it to predict the performance of all configurations as scalar values. However, prior work has pointed out that different sources of uncertainty, arising from the data collection and the modeling process, can make such scalar predictions unreliable. To address this problem, we propose a Bayesian deep learning based method, namely BDLPerf, that incorporates uncertainty into the prediction model. BDLPerf provides both scalar predictions of a configuration's performance and the corresponding confidence intervals of these predictions. We also develop a novel uncertainty calibration technique to ensure the reliability of the confidence intervals generated by a Bayesian prediction model. Finally, we propose an efficient hyperparameter tuning technique that trains the prediction model within a reasonable amount of time while achieving high accuracy. Our experimental results on 10 real-world systems show that BDLPerf achieves higher accuracy than existing approaches in both scalar performance prediction and confidence interval estimation.