Constrained optimization problems can be difficult because their search spaces have properties not conducive to search, e.g., multimodality, discontinuities, or deception. To address such difficulties, considerable research has gone into creating novel evolutionary algorithms or specialized genetic operators. However, if the representation that defines the search space could be altered so that it permits only valid solutions satisfying the constraints, the task of finding the optimum would become far more tractable without any need for specialized optimization algorithms. We propose Constrained Optimization in Latent Space (COIL), which uses a variational autoencoder (VAE) to learn a latent representation from a dataset of samples drawn from the region of the search space that is valid under a given constraint, enabling an optimizer to pursue the objective within the new space defined by the learned representation. Preliminary experiments show promise: compared to an identical GA using a standard representation, which cannot satisfy the constraints or find fit solutions, COIL with its learned latent representation satisfies different types of constraints perfectly while finding high-fitness solutions.
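The pipeline described above can be sketched in a toy, dependency-light form. Everything below is illustrative and not from the paper: the equality constraint (solutions must lie on the hyperplane sum(x) = 1), the objective, and the GA settings are all assumptions; a linear PCA mapping stands in for the VAE so the example stays self-contained, since a centered linear map already illustrates the key property that every decoded latent point lands in the valid region.

```python
import numpy as np

rng = np.random.default_rng(0)
n_dim = 5

# Hypothetical constraint for illustration: valid solutions lie on the
# hyperplane sum(x) = 1 (COIL's "valid region" for this toy problem).
def satisfies(x, tol=1e-6):
    return abs(x.sum() - 1.0) < tol

# Step 1: build a dataset of valid solutions. Here we sample freely and
# project onto the constraint surface; COIL obtains such samples from
# the valid region according to the constraint.
raw = rng.normal(size=(500, n_dim))
valid = raw - (raw.sum(axis=1, keepdims=True) - 1.0) / n_dim

# Step 2: learn a latent representation. PCA (a linear autoencoder)
# stands in for the VAE to keep the sketch dependency-free; the centered
# data lie in an (n_dim - 1)-dimensional subspace, which the top
# n_dim - 1 principal components recover exactly.
mean = valid.mean(axis=0)
_, _, vt = np.linalg.svd(valid - mean, full_matrices=False)
components = vt[: n_dim - 1]          # latent dimensionality = n_dim - 1

def decode(z):
    return mean + z @ components       # latent vector -> solution space

# Step 3: run a plain GA in latent space. Decoded individuals satisfy
# the constraint by construction, so no penalty or repair is needed.
def fitness(x):                        # toy objective: minimise ||x||^2
    return -np.sum(x ** 2)

pop = rng.normal(size=(40, n_dim - 1))
for _ in range(100):
    fit = np.array([fitness(decode(z)) for z in pop])
    parents = pop[np.argsort(fit)[-20:]]           # truncation selection
    children = parents + rng.normal(scale=0.1, size=parents.shape)
    pop = np.vstack([parents, children])           # mutation-only GA

best = decode(max(pop, key=lambda z: fitness(decode(z))))
```

With this setup, `best` satisfies the constraint to numerical precision while the GA itself never sees the constraint at all; it simply climbs the fitness landscape in the learned latent space. The actual method replaces the linear mapping with a trained VAE decoder, which can capture nonlinear valid regions.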