Accounting for uncertainty in the predictions of modern neural networks is a challenging and important task in many domains. Existing algorithms for uncertainty estimation either require modifying the model architecture and training procedure (e.g., Bayesian neural networks) or dramatically increase the computational cost of prediction, as in ensemble-based approaches. This work proposes a new algorithm that can be applied to an already trained neural network and produces approximate prediction intervals. The method is based on the classical delta method from statistics but achieves computational efficiency by using matrix sketching to approximate the Jacobian matrix. The resulting algorithm is competitive with state-of-the-art approaches for constructing prediction intervals on various regression datasets from the UCI repository.
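To make the idea concrete, below is a minimal, illustrative sketch (not the paper's exact implementation) of delta-method prediction intervals with a sketched Jacobian. The toy MLP `mlp`, the Gaussian sketch matrix, the damping term, and the residual-based noise estimate `sigma2` are all assumptions introduced for illustration; the quantities that correspond to the abstract are the Jacobian of the network output with respect to the trained parameters and its low-dimensional sketch used in the Gauss-Newton approximation.

```python
# Illustrative sketch only: delta-method intervals with a sketched Jacobian.
# All model and hyperparameter choices here (tiny MLP, Gaussian sketch,
# damping, 95% normal quantile) are assumptions, not the paper's specifics.
import jax
import jax.numpy as jnp
import numpy as np


def mlp(params, x):
    # Tiny two-layer MLP regressor: params = (W1, b1, W2, b2).
    W1, b1, W2, b2 = params
    h = jnp.tanh(x @ W1 + b1)
    return (h @ W2 + b2).squeeze(-1)


def flat_jacobian(params, X):
    # Per-example Jacobian of the scalar output w.r.t. all parameters,
    # flattened into an (n, p) matrix.
    def f_single(p, x):
        return mlp(p, x[None, :])[0]

    jac_tree = jax.vmap(lambda x: jax.grad(f_single)(params, x))(X)
    leaves = jax.tree_util.tree_leaves(jac_tree)
    return jnp.concatenate([l.reshape(X.shape[0], -1) for l in leaves], axis=1)


def delta_intervals(params, X_train, y_train, X_test, k=64, seed=0):
    # 1) Jacobians at the trained parameters.
    J_train = flat_jacobian(params, X_train)   # (n, p)
    J_test = flat_jacobian(params, X_test)     # (m, p)
    n, p = J_train.shape

    # 2) Sketch the training Jacobian: S has k << n rows, so the
    #    Gauss-Newton matrix J^T J is approximated by (S J)^T (S J).
    rng = np.random.default_rng(seed)
    S = jnp.asarray(rng.normal(size=(k, n)) / np.sqrt(k))
    SJ = S @ J_train                           # (k, p)

    # 3) Noise variance from residuals, plus a damped approximate J^T J.
    resid = y_train - mlp(params, X_train)
    sigma2 = jnp.mean(resid ** 2)
    G = SJ.T @ SJ + 1e-3 * jnp.eye(p)          # (p, p)

    # 4) Delta-method prediction variance:
    #    var(x) ~= sigma2 * (1 + g(x)^T G^{-1} g(x)).
    solved = jnp.linalg.solve(G, J_test.T)     # (p, m)
    var = sigma2 * (1.0 + jnp.sum(J_test.T * solved, axis=0))

    mean = mlp(params, X_test)
    half = 1.96 * jnp.sqrt(var)                # ~95% normal quantile
    return mean - half, mean + half
```

In this toy version the sketch only reduces the number of Jacobian rows before forming the p-by-p Gauss-Newton matrix; the Gaussian sketch could equally be replaced by other standard sketching transforms, and the damping constant is an arbitrary stabilizer for the linear solve.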