Inspired by biological neurons, activation functions play an essential role in the learning process of any artificial neural network, and such networks are commonly used in many real-world problems. Various activation functions have been proposed in the literature for both classification and regression tasks. In this work, we survey activation functions employed in the past as well as the current state of the art. In particular, we trace the development of activation functions over the years and discuss the advantages as well as the disadvantages or limitations of each. We cover classical (fixed) activation functions, including rectifier units, and adaptive activation functions. In addition to a taxonomy of activation functions based on their characterization, a taxonomy based on applications is also presented. To this end, a systematic comparison of various fixed and adaptive activation functions is performed on classification data sets such as MNIST, CIFAR-10, and CIFAR-100. In recent years, the physics-informed machine learning framework has emerged for solving problems in scientific computation. Accordingly, we also discuss the requirements on activation functions used within this framework. Furthermore, various comparisons are made among different fixed and adaptive activation functions using machine learning libraries such as TensorFlow, PyTorch, and JAX.
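To make the fixed-versus-adaptive distinction concrete, the following is a minimal illustrative sketch in NumPy, not an implementation from the survey itself. The fixed activation (ReLU) has no trainable parameters, whereas the adaptive activation shown here carries a hypothetical trainable slope parameter `a` that would be learned jointly with the network weights.

```python
import numpy as np

def relu(x):
    # Fixed activation: its shape is identical for every neuron
    # and does not change during training.
    return np.maximum(0.0, x)

def adaptive_tanh(x, a):
    # Adaptive activation (illustrative): the slope parameter `a`
    # rescales the input, so the effective steepness of the
    # nonlinearity is learned alongside the weights.
    return np.tanh(a * x)

x = np.linspace(-2.0, 2.0, 5)
print(relu(x))                 # negative inputs are clipped to zero
print(adaptive_tanh(x, 1.0))   # a = 1 recovers the plain tanh
print(adaptive_tanh(x, 5.0))   # larger a gives a steeper response
```

In an actual framework, `a` would be registered as a trainable parameter (per neuron or per layer) so that gradient descent adjusts the activation shape together with the weights.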