We present a new class of neurons, ARNs, which achieve a cross entropy on test data up to three times lower than that of carefully optimized LSTM neurons. The large improvements that are often achieved are explained by elaborate skip connections through time, up to four internal memory states per neuron, and a number of novel activation functions, including small quadratic forms. The new neurons were generated by automatic programming and are formulated as pure functional programs that can easily be transformed. We present experimental results for eight datasets and find excellent improvements for seven of them, while LSTM remains the best for one dataset. The results are so promising that automatic programming for generating new neurons should become part of the standard operating procedure for any machine learning practitioner working on time series data such as sensor signals.