Recently, much attention has been devoted to finding highly efficient and powerful activation functions for CNN layers. Because activation functions inject different nonlinearities between layers that affect performance, varying them is one method for building robust ensembles of CNNs. The objective of this study is to examine the performance of CNN ensembles made with different activation functions, including six new ones presented here: 2D Mexican ReLU, TanELU, MeLU+GaLU, Symmetric MeLU, Symmetric GaLU, and Flexible MeLU. The highest-performing ensemble was built with CNNs whose activation layers randomly replaced the standard ReLU. A comprehensive evaluation of the proposed approach was conducted across fifteen biomedical data sets representing various classification tasks. The proposed method was tested on two basic CNN architectures: Vgg16 and ResNet50. Results demonstrate the superior performance of this approach. The MATLAB source code for this study will be available at https://github.com/LorisNanni.
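The ensemble construction described above, members that differ only in which activation replaces the standard ReLU, with outputs fused by averaging, can be sketched in plain Python. This is an illustrative sketch under assumed names (`random_activation_stack`, `ensemble_average` are not from the paper), using standard activations only; the six novel activations and the paper's MATLAB implementation are not reproduced here.

```python
import math
import random

# Candidate activations. ReLU is the default that may be randomly
# replaced in each ensemble member; the paper's novel activations
# (e.g. MeLU variants) are not reproduced in this sketch.
ACTIVATIONS = {
    "relu": lambda x: max(0.0, x),
    "elu": lambda x: x if x > 0 else math.exp(x) - 1.0,
    "tanh": math.tanh,
}

def random_activation_stack(n_layers, replace_prob=0.5, rng=None):
    """Assign one activation name per layer: keep ReLU, or with
    probability `replace_prob` swap in a randomly chosen alternative."""
    rng = rng or random.Random(0)
    alternatives = [name for name in ACTIVATIONS if name != "relu"]
    return [
        rng.choice(alternatives) if rng.random() < replace_prob else "relu"
        for _ in range(n_layers)
    ]

def ensemble_average(member_scores):
    """Fuse per-class scores from all members by the mean rule."""
    n = len(member_scores)
    return [sum(scores) / n for scores in zip(*member_scores)]
```

Each ensemble member gets its own `random_activation_stack`, so networks differ only in their activation layers; at test time, their class scores are averaged by `ensemble_average`.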