This article presents a "Hybrid Self-Attention NEAT" method that improves the original NeuroEvolution of Augmenting Topologies (NEAT) algorithm on high-dimensional inputs. Although NEAT has achieved significant results on a range of challenging tasks, it fails to produce a well-tuned network when the input representation is high dimensional. Our study addresses this limitation by using self-attention as an indirect encoding method that selects the most important parts of the input. In addition, we improve overall performance with a hybrid method for evolving the final network weights. The main conclusion is that Hybrid Self-Attention NEAT removes this restriction of the original NEAT. The results indicate that, compared with other evolutionary algorithms, our model achieves comparable scores on Atari games from raw pixel input with far fewer parameters.
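To make the self-attention bottleneck concrete: the idea is to score image patches with a single attention head and keep only the centers of the most attended patches as a low-dimensional input for the evolved network. Below is a minimal NumPy sketch of this idea; the patch size, stride, key dimension, and the random `W_q`, `W_k` matrices are illustrative assumptions (in the actual method those parameters would be tuned by evolution), not the paper's exact implementation.

```python
import numpy as np

def top_attended_patch_centers(frame, patch=7, stride=4, d=4, top_k=10, rng=None):
    """Score image patches with one self-attention head and return the
    (row, col) centers of the top_k most attended patches.
    All sizes here are illustrative, not the paper's exact values."""
    rng = np.random.default_rng(0) if rng is None else rng
    H, W = frame.shape[:2]
    # Flatten every patch into a row of X: (num_patches, patch*patch).
    X, centers = [], []
    for r in range(0, H - patch + 1, stride):
        for c in range(0, W - patch + 1, stride):
            X.append(frame[r:r + patch, c:c + patch].ravel())
            centers.append((r + patch // 2, c + patch // 2))
    X = np.asarray(X, dtype=np.float64)
    # Random projections stand in for the evolved query/key parameters.
    W_q = rng.normal(size=(X.shape[1], d))
    W_k = rng.normal(size=(X.shape[1], d))
    scores = (X @ W_q) @ (X @ W_k).T / np.sqrt(d)   # (N, N) attention logits
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)               # row-wise softmax
    importance = A.sum(axis=0)                      # total attention each patch receives
    top = np.argsort(importance)[::-1][:top_k]
    return [centers[i] for i in top]

# Example: a stand-in 84x84 grayscale Atari frame.
frame = np.random.default_rng(1).random((84, 84))
print(top_attended_patch_centers(frame))
```

The payoff of this design is dimensionality reduction: an 84x84 frame (7,056 pixels) collapses to `2 * top_k` coordinates, so the downstream network that NEAT evolves stays small, which is consistent with the abstract's claim of comparable Atari scores with far fewer parameters.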