We consider the problem of efficient blackbox optimization over a large hybrid search space, consisting of a mixture of a high-dimensional continuous space and a complex combinatorial space. Such problems arise commonly in evolutionary computation, and more recently in neuroevolution and architecture search for Reinforcement Learning (RL) policies. Unfortunately, previous mutation-based approaches suffer in high-dimensional continuous spaces, both theoretically and practically. We instead propose ES-ENAS, a simple joint optimization procedure that combines Evolutionary Strategies (ES) and combinatorial optimization techniques in a highly scalable and intuitive way, inspired by the one-shot or supernet paradigm introduced in Efficient Neural Architecture Search (ENAS). Through this relatively simple marriage between two different lines of research, we gain the best of both worlds, and we empirically demonstrate our approach by optimizing BBOB functions over hybrid spaces as well as combinatorial neural network architectures via edge pruning and quantization on popular RL benchmarks. Due to the modularity of the algorithm, we are also able to incorporate a wide variety of popular techniques, ranging from the use of different continuous and combinatorial optimizers to constrained optimization.
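To make the joint procedure concrete, the following is a minimal illustrative sketch (not the authors' exact implementation) of one ES-ENAS-style iteration: each worker receives both a Gaussian perturbation of the shared continuous parameters and a sampled combinatorial configuration, and the observed rewards update both components. The function names `sample_arch`, `update_arch`, and `objective`, and all hyperparameter values, are hypothetical placeholders.

```python
import numpy as np

def es_enas_step(theta, sample_arch, update_arch, objective,
                 num_workers=8, sigma=0.1, lr=0.02):
    """One illustrative ES-ENAS-style iteration (sketch, not the paper's code).

    theta       : shared continuous parameters (e.g. supernet weights).
    sample_arch : draws a combinatorial configuration from the controller.
    update_arch : feeds a reward back to the combinatorial optimizer.
    objective   : blackbox reward over (continuous, combinatorial) pairs.
    """
    dim = theta.shape[0]
    grad = np.zeros(dim)
    for _ in range(num_workers):
        eps = np.random.randn(dim)        # continuous Gaussian perturbation
        arch = sample_arch()              # combinatorial sample for this worker
        r_plus = objective(theta + sigma * eps, arch)
        r_minus = objective(theta - sigma * eps, arch)
        grad += (r_plus - r_minus) * eps  # antithetic ES gradient estimate
        update_arch(arch, 0.5 * (r_plus + r_minus))  # reward to the controller
    # Ascend the estimated gradient of the expected reward.
    return theta + lr * grad / (2.0 * sigma * num_workers)
```

As a usage sketch, `arch` could be a binary mask over network edges (the edge-pruning setting), with `update_arch` implemented by any combinatorial optimizer such as regularized evolution or a policy-gradient controller; the modularity claimed in the abstract corresponds to swapping these components independently.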