This paper presents a new algorithm, Evolutionary eXploration of Augmenting Memory Models (EXAMM), which is capable of evolving recurrent neural networks (RNNs) using a wide variety of memory structures, such as Delta-RNN, GRU, LSTM, MGU, and UGRNN cells. EXAMM evolved RNNs to perform prediction of large-scale, real-world time series data from the aviation and power industries. These data sets consist of very long time series (thousands of readings), each with a large number of potentially correlated and dependent parameters. Four different parameters were selected for prediction, and EXAMM runs were performed using each memory cell type alone, each cell type with feed-forward nodes, and all possible memory cell types together. Evolved RNN performance was measured using repeated k-fold cross-validation, resulting in 1,210 EXAMM runs which evolved 2,420,000 RNNs in 12,100 CPU hours on a high performance computing cluster. Generalization of the evolved RNNs was examined statistically, providing interesting findings that can help refine RNN memory cell design as well as inform future neuro-evolution algorithm development.