Automated Machine Learning with ensembling (AutoML with ensembling) seeks to automatically build ensembles of Deep Neural Networks (DNNs) to produce high-quality predictions. Ensembles of DNNs are well known to reduce over-fitting, but they are memory- and time-consuming approaches. Therefore, an ideal AutoML should produce, in a single run, several ensembles trading off accuracy against inference speed. While previous work on AutoML focuses on searching for the single best model to maximize its generalization ability, we instead propose a new AutoML that builds a large library of accurate and diverse individual models from which ensembles are then constructed. First, our extensive benchmarks show that asynchronous Hyperband is an efficient and robust way to build a large number of diverse models to combine. Second, a new ensemble selection method based on a multi-objective greedy algorithm is proposed to generate accurate ensembles while controlling their computing cost. Finally, we propose a novel algorithm to optimize the inference of a DNN ensemble on a GPU cluster through allocation optimization. The resulting AutoML with ensembling method shows robust results on two datasets, using GPU clusters efficiently during both the training phase and the inference phase.
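To make the multi-objective greedy ensemble selection concrete, the following is a minimal sketch of a Caruana-style greedy selection under an inference-cost budget. It assumes per-model validation probabilities, validation labels, and per-model inference costs are already available; all names (`val_probs`, `val_labels`, `costs`, `budget`) are hypothetical placeholders, not the paper's actual API, and selection with replacement is omitted for brevity.

```python
import numpy as np


def greedy_cost_aware_selection(val_probs, val_labels, costs, budget):
    """Greedily add the model that most improves validation accuracy of the
    averaged ensemble, while keeping the summed inference cost under budget."""
    n_models = len(val_probs)
    selected = []                                  # indices of chosen models
    ensemble_sum = np.zeros_like(val_probs[0])     # running sum of probabilities
    total_cost = 0.0

    while True:
        best_idx, best_acc = None, -1.0
        for i in range(n_models):
            # Skip already-selected models and candidates that break the budget.
            if i in selected or total_cost + costs[i] > budget:
                continue
            probs = (ensemble_sum + val_probs[i]) / (len(selected) + 1)
            acc = np.mean(probs.argmax(axis=1) == val_labels)
            if acc > best_acc:
                best_idx, best_acc = i, acc
        if best_idx is None:
            break  # budget exhausted or no remaining candidate
        selected.append(best_idx)
        ensemble_sum += val_probs[best_idx]
        total_cost += costs[best_idx]
    return selected


# Toy usage with random data: 5 candidate models, 3 classes, 200 validation points.
rng = np.random.default_rng(0)
val_labels = rng.integers(0, 3, size=200)
val_probs = [rng.dirichlet(np.ones(3), size=200) for _ in range(5)]
costs = [1.0, 2.0, 0.5, 1.5, 3.0]
print(greedy_cost_aware_selection(val_probs, val_labels, costs, budget=4.0))
```

Varying `budget` in such a scheme yields a family of ensembles of increasing size, which is one way to obtain, from a single run, ensembles with different accuracy/inference-speed trade-offs as described above.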