Optimizing multiple competing objectives is a common problem across science and industry. The inherent trade-offs between such objectives lead to the task of exploring their Pareto front. A meaningful quantity for this purpose is the hypervolume indicator, which is used in Bayesian Optimization (BO) and Evolutionary Algorithms (EAs). However, the computational complexity of calculating the hypervolume scales unfavorably with an increasing number of objectives and data points, which restricts its use in these common multi-objective optimization frameworks. To overcome these restrictions, we propose to approximate the hypervolume function with a deep neural network, which we call DeepHV. For better sample efficiency and generalization, we exploit the fact that the hypervolume is scale-equivariant in each objective as well as permutation-invariant w.r.t. both the objectives and the samples, by using a deep neural network that is equivariant w.r.t. the combined group of scalings and permutations. We evaluate our method against exact and approximate hypervolume methods in terms of accuracy, computation time, and generalization. We also apply and compare our methods with state-of-the-art multi-objective BO methods and EAs on a range of synthetic benchmark test cases. The results show that our methods are promising for such multi-objective optimization tasks.
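To make the abstract's claims concrete, the following is a minimal illustrative sketch (not from the paper) of exact hypervolume computation in the simplest two-objective minimization case, where a sweep over the sorted Pareto front runs in O(n log n); it is in higher dimensions that exact computation becomes prohibitively expensive. The function name `hypervolume_2d` and its arguments are hypothetical choices for this example.

```python
def hypervolume_2d(points, ref):
    """Exact hypervolume dominated by a set of 2D points (minimization),
    measured relative to the reference point `ref`.

    Illustrative sketch only: sort by the first objective, then sweep,
    accumulating the area of each new horizontal slab that a
    non-dominated point contributes.
    """
    # Discard points that do not dominate the reference point.
    pts = sorted(p for p in points if p[0] < ref[0] and p[1] < ref[1])
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:
        if y < prev_y:  # point is non-dominated in the sorted sweep
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv
```

The scale-equivariance exploited by DeepHV can be checked directly on this sketch: multiplying one objective (and the corresponding reference coordinate) by a factor c multiplies the hypervolume by c, e.g. `hypervolume_2d([(0, 0.5), (1, 0)], (2, 1))` equals twice `hypervolume_2d([(0, 0.5), (0.5, 0)], (1, 1))`.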