Ensembling can improve the performance of Neural Networks, but existing approaches struggle when the architecture likelihood surface has dispersed, narrow peaks. Furthermore, existing methods construct equally weighted ensembles, which are likely to be vulnerable to the failure modes of the weaker architectures. By viewing ensembling as approximately marginalising over architectures, we construct ensembles using the tools of Bayesian Quadrature -- tools which are well suited to the exploration of likelihood surfaces with dispersed, narrow peaks. Additionally, the resulting ensembles consist of architectures weighted commensurately with their performance. We show empirically -- in terms of test likelihood, accuracy, and expected calibration error -- that our method outperforms state-of-the-art baselines, and we verify via ablation studies that its components do so independently.
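To make the marginalisation view concrete, the following is a minimal sketch in standard notation (the symbols $a$, $w_i$, and $K$ are illustrative, not taken from the paper): the ensemble predictive distribution is read as a quadrature approximation to an integral over architectures,

$$
p(y \mid x, \mathcal{D}) \;=\; \int p(y \mid x, a, \mathcal{D})\, p(a \mid \mathcal{D})\, \mathrm{d}a \;\approx\; \sum_{i=1}^{K} w_i\, p(y \mid x, a_i, \mathcal{D}),
$$

where Bayesian Quadrature supplies the weights $w_i$ in place of the uniform $1/K$ of an equally weighted ensemble, so that better-performing architectures receive larger weight.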