In this paper, we approach the problem of optimizing blackbox functions over large hybrid search spaces consisting of both combinatorial and continuous parameters. We demonstrate, both theoretically and empirically, that previous mutation-based evolutionary algorithms, while flexible over combinatorial spaces, suffer from a curse of dimensionality in high-dimensional continuous spaces, which limits their scope over hybrid search spaces as well. To combat this curse, we propose ES-ENAS, a simple and modular joint optimization procedure that combines the class of sample-efficient smoothed-gradient techniques, commonly known as Evolution Strategies (ES), with combinatorial optimizers in a highly scalable and intuitive way, inspired by the one-shot or supernet paradigm introduced in Efficient Neural Architecture Search (ENAS). By doing so, we achieve significantly greater sample efficiency, which we demonstrate empirically on synthetic benchmarks, and we are further able to apply ES-ENAS to architecture search over popular RL benchmarks.
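To make the joint procedure concrete, the following is a minimal sketch, not the paper's implementation: an antithetic smoothed-gradient (ES) estimator for the continuous weights, paired with a placeholder combinatorial sampler (here a uniform draw, standing in for an ENAS-style controller) that selects a combinatorial choice each iteration while the continuous parameters are shared. All function and parameter names (`es_gradient`, `joint_step`, `archs`, etc.) are hypothetical.

```python
import numpy as np

def es_gradient(f, theta, arch, sigma=0.1, num_samples=64, rng=None):
    # Antithetic smoothed-gradient (ES) estimate of
    # grad_theta E_g[ f(theta + sigma * g, arch) ],  g ~ N(0, I).
    rng = np.random.default_rng(0) if rng is None else rng
    grad = np.zeros_like(theta)
    for _ in range(num_samples):
        g = rng.standard_normal(theta.shape)
        grad += (f(theta + sigma * g, arch) - f(theta - sigma * g, arch)) * g
    return grad / (2.0 * sigma * num_samples)

def joint_step(f, theta, archs, lr=0.05, rng=None):
    # One joint update: sample a combinatorial choice (uniformly here,
    # as a stand-in for a learned combinatorial optimizer), then take an
    # ES ascent step on the shared continuous parameters.
    rng = np.random.default_rng(1) if rng is None else rng
    arch = archs[rng.integers(len(archs))]
    theta_new = theta + lr * es_gradient(f, theta, arch, rng=rng)
    return theta_new, arch
```

In the actual ES-ENAS setup, the uniform draw would be replaced by a combinatorial optimizer updated from the same blackbox evaluations, so both parameter types improve jointly from one batch of rollouts.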