Ensembling can improve the performance of Neural Networks, but existing approaches struggle when the architecture likelihood surface has dispersed, narrow peaks. Furthermore, existing methods construct equally weighted ensembles, which are likely to be vulnerable to the failure modes of their weaker members. By viewing ensembling as approximately marginalising over architectures, we construct ensembles using the tools of Bayesian Quadrature -- tools which are well suited to the exploration of likelihood surfaces with dispersed, narrow peaks. Additionally, the resulting ensembles consist of architectures weighted commensurate with their performance. We show empirically -- in terms of test likelihood, accuracy, and expected calibration error -- that our method outperforms state-of-the-art baselines, and verify via ablation studies that each of its components contributes independently to this improvement.
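As a loose illustration of the core idea (a minimal sketch, not the paper's actual algorithm), the snippet below shows how a weighted ensemble approximates the architecture-marginalised predictive, p(y | x, D) ≈ Σ_i w_i p(y | x, a_i, D); the function name `weighted_ensemble_predict` and the example weights are illustrative assumptions, and the quadrature weights themselves would come from the Bayesian Quadrature procedure described in the paper.

```python
import numpy as np

def weighted_ensemble_predict(member_probs, weights):
    """Approximate the architecture-marginalised predictive
    p(y | x, D) ~= sum_i w_i * p(y | x, a_i, D).

    member_probs: array of shape (K, n_classes), one predictive
        distribution per ensemble member (architecture).
    weights: length-K array; uniform weights recover a standard
        equally weighted ensemble, while performance-commensurate
        weights down-weight weaker architectures.
    """
    member_probs = np.asarray(member_probs, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalise so the mixture is a distribution
    return weights @ member_probs

# Illustrative usage with three hypothetical architectures on a binary task.
probs = [[0.7, 0.3],
         [0.4, 0.6],
         [0.9, 0.1]]
print(weighted_ensemble_predict(probs, [0.5, 0.1, 0.4]))
```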