Designing optimisation algorithms that perform well in general requires experimentation on a range of diverse problems. Training neural networks is an optimisation task that has gained prominence with the recent successes of deep learning. Although evolutionary algorithms have been used to train neural networks, gradient descent variants are by far the most common choice, owing to their proven performance on large-scale machine learning tasks. With this paper we contribute CORNN (Continuous Optimisation of Regression tasks using Neural Networks), a large suite for benchmarking the performance of any continuous black-box algorithm on neural network training problems. Using a range of regression problems and neural network architectures, problem instances of varying dimensionality and difficulty can be created. We demonstrate the use of the CORNN suite by comparing three evolutionary and swarm-based algorithms on over 300 problem instances, showing evidence of performance complementarity between them. As a baseline, the best population-based algorithm is benchmarked against a gradient-based approach. The CORNN suite is shared as a public web repository to facilitate easy integration with existing benchmarking platforms.
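The core idea can be illustrated with a minimal sketch (this is not the actual CORNN API; the task, architecture, and optimiser below are illustrative assumptions): training a small fixed-architecture regression network is exposed as a continuous black-box objective f(w) over the flattened weight vector w, which any population-based or other black-box optimiser can then minimise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (illustrative, not from the CORNN suite): fit y = sin(x).
X = rng.uniform(-3, 3, size=(64, 1))
y = np.sin(X)

# Fixed architecture: 1 -> 8 -> 1 with a tanh hidden layer.
SIZES = [(1, 8), (8,), (8, 1), (1,)]       # shapes of W1, b1, W2, b2
DIM = sum(int(np.prod(s)) for s in SIZES)  # dimensionality of the search space

def unpack(w):
    """Split a flat parameter vector back into weight/bias arrays."""
    parts, i = [], 0
    for s in SIZES:
        n = int(np.prod(s))
        parts.append(w[i:i + n].reshape(s))
        i += n
    return parts

def loss(w):
    """Black-box objective: mean squared error of the network given weights w."""
    W1, b1, W2, b2 = unpack(w)
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return float(np.mean((pred - y) ** 2))

# Any continuous black-box optimiser can be applied to `loss`; simple random
# search stands in here for an evolutionary or swarm-based algorithm.
best_w = rng.standard_normal(DIM)
best_f = loss(best_w)
for _ in range(500):
    cand = best_w + 0.3 * rng.standard_normal(DIM)
    f = loss(cand)
    if f < best_f:
        best_w, best_f = cand, f
```

Because the optimiser only ever calls `loss(w)`, gradient-free and gradient-based methods can be compared on identical problem instances, which is the kind of comparison the suite is designed to support.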