Deep neural networks (DNNs) are a widely used solution for a variety of machine learning problems. However, a data scientist must often invest significant time to pre-process input data, test different neural network architectures, and tune hyper-parameters for optimal performance. Automated machine learning (AutoML) methods automatically search the architecture and hyper-parameter space for optimal neural networks. However, current state-of-the-art (SOTA) methods do not include traditional input-data manipulation techniques in the algorithmic search space. We adapt the Evolutionary Multi-objective Algorithm Design Engine (EMADE), a multi-objective evolutionary search framework for traditional machine learning methods, to perform neural architecture search. We also integrate EMADE's signal-processing and image-processing primitives, which allow EMADE to manipulate input data before it is ingested by the simultaneously evolved DNN. We show that including these primitives in the search space has the potential to improve performance on the CIFAR-10 image classification benchmark dataset.
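
To make the combined search space concrete, the sketch below is a toy illustration of the core idea: co-evolving a data pre-processing pipeline together with a network architecture under two minimized objectives (validation error and model complexity). This is not EMADE's actual API; the primitive names in `PREPROCESS_OPS`, the layer encoding, and the stub `evaluate` function are all illustrative assumptions, and a real evaluation would train the candidate DNN on the pre-processed CIFAR-10 data.

```python
# Toy sketch of co-evolving a pre-processing pipeline with a DNN architecture.
# All names and the stub evaluation are illustrative assumptions, not EMADE's API.
import random

PREPROCESS_OPS = ["median_filter", "fft", "histogram_eq", "identity"]  # assumed primitives
LAYER_CHOICES = [("conv", 32), ("conv", 64), ("pool", 2), ("dense", 128)]

def random_genome():
    """A genome pairs a pre-processing pipeline with a list of layer specs."""
    pipeline = random.sample(PREPROCESS_OPS, k=random.randint(1, 3))
    layers = [random.choice(LAYER_CHOICES) for _ in range(random.randint(2, 6))]
    return {"pipeline": pipeline, "layers": layers}

def evaluate(genome):
    """Stub for building/training the DNN on pre-processed data.
    Returns (error, complexity); both objectives are minimized."""
    error = random.random()             # placeholder for CIFAR-10 validation error
    complexity = len(genome["layers"])  # crude proxy for model size
    return error, complexity

def dominates(a, b):
    """Pareto dominance: a is no worse in all objectives and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def mutate(genome):
    """Mutate either the pre-processing pipeline or the architecture."""
    child = {"pipeline": list(genome["pipeline"]), "layers": list(genome["layers"])}
    if random.random() < 0.5:
        child["pipeline"][random.randrange(len(child["pipeline"]))] = random.choice(PREPROCESS_OPS)
    else:
        child["layers"].append(random.choice(LAYER_CHOICES))
    return child

# Simple multi-objective loop: keep the non-dominated front, refill by mutation.
population = [random_genome() for _ in range(20)]
for generation in range(10):
    scored = [(g, evaluate(g)) for g in population]
    front = [g for g, f in scored if not any(dominates(f2, f) for _, f2 in scored)]
    population = front + [mutate(random.choice(front)) for _ in range(20 - len(front))]
```

Because both the pipeline and the architecture live in one genome, selection pressure applies jointly, which is what lets traditional pre-processing methods compete and combine with architectural choices during the search.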