Convolutional neural networks (CNNs) are a representative class of deep learning algorithms that use convolutional computation and, through their hierarchical architecture, perform translation-invariant classification of input data. However, classical CNN training relies on steepest-descent optimization, and learning performance is strongly influenced by the initial weight settings of the convolutional and fully connected layers, so hyperparameters must be re-tuned to achieve good performance under different model structures and datasets. Exploiting the strength of the simulated annealing algorithm in global search, we propose applying it to the hyperparameter search process to improve the effectiveness of CNNs. In this paper, we introduce SA-CNN, a neural network for text classification tasks built on Text-CNN, and implement the simulated annealing algorithm for its hyperparameter search. Experiments demonstrate that this approach achieves higher classification accuracy than earlier manually tuned models, and that the savings in search time and effort relative to manual tuning are substantial.
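The hyperparameter search described above can be illustrated with a minimal simulated-annealing sketch. This is not the paper's implementation: the candidate value lists, the cooling schedule, and the `surrogate_accuracy` stand-in (which replaces an actual Text-CNN training run returning validation accuracy) are all illustrative assumptions.

```python
import math
import random

# Candidate values per Text-CNN hyperparameter (illustrative, not the
# paper's exact search space).
SPACE = {
    "num_filters": [50, 100, 150, 200],
    "filter_size": [2, 3, 4, 5],
    "dropout": [0.3, 0.4, 0.5, 0.6],
    "learning_rate": [1e-4, 5e-4, 1e-3, 5e-3],
}

def random_config(rng):
    return {k: rng.choice(v) for k, v in SPACE.items()}

def neighbor(config, rng):
    """Perturb one hyperparameter to an adjacent value in its candidate list."""
    new = dict(config)
    key = rng.choice(list(SPACE))
    values = SPACE[key]
    i = values.index(new[key])
    j = min(max(i + rng.choice([-1, 1]), 0), len(values) - 1)
    new[key] = values[j]
    return new

def anneal(evaluate, t0=1.0, t_min=1e-3, alpha=0.9, steps_per_temp=5, seed=0):
    """Maximize evaluate(config) (e.g. validation accuracy) by simulated annealing."""
    rng = random.Random(seed)
    current = random_config(rng)
    current_score = evaluate(current)
    best, best_score = current, current_score
    t = t0
    while t > t_min:
        for _ in range(steps_per_temp):
            cand = neighbor(current, rng)
            score = evaluate(cand)
            # Always accept improvements; accept worse moves with
            # Metropolis probability exp(delta / t), so early (hot) phases
            # explore globally and late (cold) phases refine locally.
            if score > current_score or rng.random() < math.exp((score - current_score) / t):
                current, current_score = cand, score
                if score > best_score:
                    best, best_score = cand, score
        t *= alpha  # geometric cooling schedule
    return best, best_score

# Hypothetical stand-in for "train Text-CNN, return validation accuracy";
# it peaks at 100 filters, filter size 3, dropout 0.5, lr 1e-3.
def surrogate_accuracy(cfg):
    target = {"num_filters": 100, "filter_size": 3, "dropout": 0.5, "learning_rate": 1e-3}
    penalty = sum(abs(SPACE[k].index(cfg[k]) - SPACE[k].index(target[k])) for k in SPACE)
    return 0.95 - 0.03 * penalty

best, score = anneal(surrogate_accuracy)
```

In a real run, `evaluate` would train the Text-CNN with the candidate configuration and return its validation accuracy; the acceptance rule is what distinguishes annealing from greedy search and gives it a chance to escape local optima.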