Activation functions (AFs) play a pivotal role in the performance of neural networks. The Rectified Linear Unit (ReLU) is currently the most commonly used AF. Several replacements for ReLU have been suggested, but improvements have proven inconsistent. Some AFs perform better on specific tasks, but it is hard to know a priori how to select the appropriate one(s). Studying both standard fully connected neural networks (FCNs) and convolutional neural networks (CNNs), we propose a novel three-population coevolutionary algorithm to evolve AFs, and compare it to four other methods, both evolutionary and non-evolutionary. Tested on four datasets -- MNIST, FashionMNIST, KMNIST, and USPS -- coevolution proves to be a performant algorithm for finding good AFs and AF architectures.
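To make the idea of multi-population coevolution of AFs concrete, here is a minimal, hypothetical sketch of a three-population cooperative coevolutionary loop. The population roles (one AF "slot" per population), the primitive set, the toy fitness proxy, and all names below are illustrative assumptions, not the paper's actual method; a real system would train a network per candidate team and use validation accuracy as fitness.

```python
# Hypothetical sketch: three-population cooperative coevolution of AFs.
# Assumptions: each population fills one AF slot; fitness is a toy
# regression proxy, NOT the paper's train-a-network evaluation.
import math
import random

# Candidate activation-function primitives (an assumed building-block set).
PRIMITIVES = {
    "relu":     lambda x: max(0.0, x),
    "tanh":     math.tanh,
    "sigmoid":  lambda x: 1.0 / (1.0 + math.exp(-x)),
    "identity": lambda x: x,
}

def random_individual():
    """An individual is simply the name of one primitive AF."""
    return random.choice(list(PRIMITIVES))

def fitness(team, data):
    """Toy proxy: apply the team's AFs in sequence and score the output
    against a target via negative squared error."""
    score = 0.0
    for x, y in data:
        h = x
        for name in team:            # one AF per population slot
            h = PRIMITIVES[name](h)
        score -= (h - y) ** 2
    return score

def coevolve(generations=30, pop_size=8, n_pops=3):
    pops = [[random_individual() for _ in range(pop_size)] for _ in range(n_pops)]
    data = [(random.uniform(-2.0, 2.0), 0.5) for _ in range(32)]
    for _ in range(generations):
        for i in range(n_pops):
            # Evaluate each candidate alongside the current best collaborator
            # from every other population (a standard cooperative scheme).
            partners = [max(p, key=lambda a: fitness([a] * n_pops, data))
                        for p in pops]
            scored = sorted(
                ((fitness(partners[:i] + [cand] + partners[i + 1:], data), cand)
                 for cand in pops[i]),
                reverse=True,
            )
            # Truncation selection; refill with fresh random individuals
            # as a crude stand-in for mutation/crossover.
            survivors = [cand for _, cand in scored[: pop_size // 2]]
            pops[i] = survivors + [random_individual()
                                   for _ in range(pop_size - len(survivors))]
    return [max(p, key=lambda a: fitness([a] * n_pops, data)) for p in pops]

if __name__ == "__main__":
    print(coevolve())  # e.g. ['identity', 'tanh', 'relu']
```

The key design choice shown here is that candidates are never scored in isolation: each individual's fitness depends on collaborators drawn from the other populations, which is what distinguishes coevolution from running three independent evolutionary searches.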