The evaluation of hyperparameters, neural architectures, and data augmentation policies becomes a critical model selection problem in modern deep learning, where the search space is large. In this paper, we propose an efficient and robust bandit-based algorithm called Sub-Sampling (SS) for hyperparameter search evaluation. It assesses the potential of each hyperparameter configuration from sub-samples of the observations and is theoretically proven to be optimal under the criterion of cumulative regret. We further combine SS with Bayesian Optimization to develop a novel hyperparameter optimization algorithm called BOSS. Empirical studies validate our theoretical arguments about SS and demonstrate the superior performance of BOSS on a range of applications, including Neural Architecture Search (NAS), Data Augmentation (DA), Object Detection (OD), and Reinforcement Learning (RL).
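To make the core idea concrete, the following is a minimal illustrative sketch of sub-sampling-based comparison in a bandit setting, not the paper's exact SS procedure. It assumes sub-samples are taken as suffixes of the leader's observation sequence, defines the leader as the most-pulled configuration (ties broken by mean reward), and uses hypothetical names (`is_competitive`, `sub_sampling_rounds`, `evaluate`) throughout.

```python
# Illustrative sketch only: a challenger configuration is pulled again if its
# mean reward beats the mean of some sub-sample (here: some suffix) of the
# leader's observations. All function names are hypothetical.
import random
from statistics import mean

def is_competitive(leader_obs, challenger_obs):
    """Challenger beats some suffix sub-sample of the leader's rewards."""
    if len(challenger_obs) < len(leader_obs):
        m = mean(challenger_obs)
        # suffixes of the leader's sequence at least as long as the challenger's
        return any(m >= mean(leader_obs[s:])
                   for s in range(len(leader_obs) - len(challenger_obs) + 1))
    return False

def sub_sampling_rounds(evaluate, n_configs=4, n_rounds=30):
    obs = [[evaluate(k)] for k in range(n_configs)]  # one initial pull each
    for _ in range(n_rounds):
        # leader: most observations, ties broken by mean reward
        leader = max(range(n_configs), key=lambda k: (len(obs[k]), mean(obs[k])))
        # pull the leader plus every challenger that looks competitive
        to_pull = [leader] + [k for k in range(n_configs)
                              if k != leader and is_competitive(obs[leader], obs[k])]
        for k in to_pull:
            obs[k].append(evaluate(k))
    return max(range(n_configs), key=lambda k: mean(obs[k]))

# Toy usage: noisy "validation accuracy" of four hyperparameter settings.
means = [0.70, 0.74, 0.78, 0.76]
best = sub_sampling_rounds(lambda k: random.gauss(means[k], 0.05))
print("selected configuration:", best)
```

The design point this sketch tries to convey is that comparisons against sub-samples, rather than against a single aggregate statistic, let a configuration with few observations remain in contention when any stretch of the leader's history looks beatable, which is what drives the exploration behavior of sub-sampling methods.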