Inspired by biological neurons, activation functions play an essential role in the learning process of artificial neural networks, which are commonly used in many real-world problems. Various activation functions have been proposed in the literature for both classification and regression tasks. In this work, we survey activation functions employed in the past as well as the current state of the art. In particular, we trace the development of activation functions over the years and discuss the advantages as well as the disadvantages or limitations of each. We cover classical (fixed) activation functions, including rectifier units, and adaptive activation functions. In addition to a taxonomy of activation functions based on their characterization, we present a taxonomy based on their applications. To this end, a systematic comparison of various fixed and adaptive activation functions is performed on classification data sets such as MNIST, CIFAR-10, and CIFAR-100. In recent years, a physics-informed machine learning framework has emerged for solving problems related to scientific computing; accordingly, we also discuss the requirements that activation functions must satisfy in this framework. Furthermore, various comparisons are made among different fixed and adaptive activation functions using machine learning libraries such as TensorFlow, PyTorch, and JAX.
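The fixed-versus-adaptive distinction drawn above can be illustrated with a minimal sketch: a fixed activation has the same shape for every neuron throughout training, while an adaptive activation carries a trainable parameter that changes its shape as the network learns. The function names and the single scalar parameter `a` below are illustrative assumptions, not the exact formulation used in any particular surveyed method.

```python
import math

# Fixed activation: the nonlinearity's shape never changes during training.
def relu(x):
    return max(0.0, x)

# Adaptive activation: a trainable scalar `a` scales the input, so the
# effective slope of the nonlinearity is learned alongside the weights
# (a simplified stand-in for the adaptive schemes surveyed in this work).
def adaptive_tanh(x, a):
    return math.tanh(a * x)

# Negative inputs are clipped to zero by the fixed ReLU.
print(relu(-2.0))
# A larger learned `a` makes the adaptive tanh saturate faster.
print(adaptive_tanh(1.0, 0.5))
print(adaptive_tanh(1.0, 2.0))
```

In practice, frameworks such as PyTorch expose this idea through trainable activations (e.g., a per-channel slope parameter), and the survey's comparisons evaluate how such learned parameters affect convergence on the benchmark data sets.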