Activation functions shape the outputs of artificial neurons and are therefore integral parts of neural networks in general and deep learning in particular. Some activation functions, such as the logistic function and ReLU, have been in use for decades. But as deep learning became a mainstream research topic, new activation functions have mushroomed, leading to confusion in both theory and practice. This paper provides an analytic yet up-to-date overview of popular activation functions and their properties, making it a timely resource for anyone who studies or applies neural networks.
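To make the two long-standing activation functions named above concrete, here is a minimal sketch of the logistic (sigmoid) and ReLU functions; the function names and scalar formulation are illustrative choices, not code from the paper.

```python
import math

def logistic(x: float) -> float:
    # Logistic (sigmoid) activation: maps any real input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def relu(x: float) -> float:
    # ReLU activation: passes positive inputs through, zeroes out negatives.
    return max(0.0, x)

print(logistic(0.0))  # 0.5
print(relu(-2.0))     # 0.0
print(relu(3.0))      # 3.0
```

The logistic function saturates for inputs of large magnitude, while ReLU remains unbounded for positive inputs; this difference in behavior is one reason the two functions have distinct roles in practice.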