We consider the problem of efficient blackbox optimization over a large hybrid search space, consisting of a mixture of a high-dimensional continuous space and a complex combinatorial space. Such problems arise commonly in evolutionary computation, and more recently in neuroevolution and architecture search for Reinforcement Learning (RL) policies. In this paper, we introduce ES-ENAS, a simple joint optimization procedure that combines Evolutionary Strategies (ES) and combinatorial optimization techniques in a highly scalable and intuitive way, inspired by the \textit{one-shot} or \textit{supernet} paradigm introduced in Efficient Neural Architecture Search (ENAS). Our main insight is that ES is already a highly distributed algorithm involving hundreds of blackbox evaluations, which can be used not only for training neural network weights but also as feedback to a combinatorial optimizer. Through this relatively simple marriage between two different lines of research, we gain the best of both worlds, and empirically demonstrate our approach by optimizing BBOB functions over hybrid spaces as well as combinatorial neural network architectures via edge pruning and quantization on popular RL benchmarks. Due to the modularity of the algorithm, we are also able to incorporate a wide variety of popular techniques, ranging from the use of different continuous and combinatorial optimizers to constrained optimization.
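To make the joint procedure concrete, the following is a minimal sketch (not the paper's implementation) of how a single ES update could reuse each blackbox evaluation as feedback to a combinatorial controller. The \texttt{controller} object with \texttt{sample\_architecture}/\texttt{update} methods and the \texttt{evaluate} blackbox are hypothetical placeholders introduced purely for illustration.

\begin{verbatim}
# A minimal sketch of one ES-ENAS-style joint step, assuming a hypothetical
# `controller` (combinatorial optimizer) and `evaluate` (blackbox objective).
import numpy as np

def es_enas_step(theta, controller, evaluate,
                 num_workers=100, sigma=0.1, lr=0.01):
    """Jointly update continuous weights `theta` and the combinatorial controller."""
    epsilons, rewards, archs = [], [], []
    for _ in range(num_workers):
        eps = np.random.randn(*theta.shape)        # ES perturbation of the weights
        arch = controller.sample_architecture()    # combinatorial sample (e.g. pruning mask)
        r = evaluate(theta + sigma * eps, arch)    # single blackbox evaluation, reused twice
        epsilons.append(eps)
        rewards.append(r)
        archs.append(arch)

    rewards = np.array(rewards)
    advantages = (rewards - rewards.mean()) / (rewards.std() + 1e-8)

    # (1) ES gradient estimate for the continuous weights.
    grad = sum(a * e for a, e in zip(advantages, epsilons)) / (num_workers * sigma)
    theta = theta + lr * grad

    # (2) The very same rewards serve as feedback to the combinatorial optimizer.
    controller.update(archs, rewards)
    return theta
\end{verbatim}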