For the design of optimisation algorithms that perform well in general, it is necessary to experiment with and benchmark on a range of problems with diverse characteristics. The training of neural networks is an optimisation task that has gained prominence with the recent successes of deep learning. Although evolutionary algorithms have been used for training neural networks, gradient descent variants are by far the most common choice, owing to their proven performance on large-scale machine learning tasks. With this paper we contribute CORNN (Continuous Optimisation of Regression tasks using Neural Networks), a large suite that can easily be used to benchmark the performance of any continuous black-box algorithm on neural network training problems. By employing different base regression functions and neural network architectures, problem instances of varying dimensionality and difficulty can be created. We demonstrate the use of the CORNN Suite by comparing the performance of three evolutionary and swarm-based algorithms on a set of over 300 problem instances. With the exception of random search, we provide evidence of performance complementarity between the algorithms. As a baseline, results are also provided to contrast the performance of the best population-based algorithm against a gradient-based approach (Adam). The suite is shared as a public web repository to facilitate easy integration with existing benchmarking platforms.
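To make the construction concrete, the following is a minimal sketch of what a CORNN-style problem instance might look like: a small fixed-architecture feed-forward network is fitted to samples of a base regression function, and the flattened weight vector is exposed to the optimiser as a continuous black-box objective. The function names, the 1-hidden-layer tanh architecture, and the sine base function are illustrative assumptions, not the actual CORNN API.

```python
import numpy as np

def make_instance(base_fn, n_samples=200, hidden=8, seed=0):
    """Hypothetical CORNN-style instance: black-box MSE of a 1->hidden->1 tanh net."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, size=(n_samples, 1))
    y = base_fn(x)

    # Flattened weight layout: W1 (1 x hidden), b1 (hidden), W2 (hidden x 1), b2 (scalar)
    n_params = 3 * hidden + 1

    def loss(theta):
        # Black-box objective: mean squared error of the network on (x, y)
        w1 = theta[:hidden].reshape(1, hidden)
        b1 = theta[hidden:2 * hidden]
        w2 = theta[2 * hidden:3 * hidden].reshape(hidden, 1)
        b2 = theta[3 * hidden]
        h = np.tanh(x @ w1 + b1)
        pred = h @ w2 + b2
        return float(np.mean((pred - y) ** 2))

    return n_params, loss

# Example: build one instance from a simple base regression function and
# evaluate it at a random candidate, as any black-box optimiser would.
dim, f = make_instance(lambda x: np.sin(3 * x))
print(dim, f(np.random.default_rng(1).standard_normal(dim)))
```

Varying the base function, the number of hidden units, or the network depth changes the dimensionality and difficulty of the instance, which is the mechanism the suite uses to generate a diverse problem set.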