We propose the Gaussian Error Linear Unit (GELU), a high-performing neural network activation function. The GELU nonlinearity is the expected transformation of a stochastic regularizer that randomly applies the identity or zero map to a neuron's input. The GELU nonlinearity weights inputs by their magnitude, rather than gating inputs by their sign as in ReLUs. We perform an empirical evaluation of the GELU nonlinearity against the ReLU and ELU activations and find performance improvements across all considered computer vision, natural language processing, and speech tasks.
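For concreteness, the following is a minimal Python sketch, assuming the GELU is taken as GELU(x) = x·Φ(x) with Φ the standard Gaussian CDF, alongside the commonly used tanh-based approximation; it is an illustration of the activation, not a reference implementation from this work.

```python
import math

def gelu(x: float) -> float:
    """Exact GELU: x * Phi(x), where Phi is the standard Gaussian CDF."""
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    """Tanh-based approximation of the GELU."""
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

if __name__ == "__main__":
    # Negative inputs are shrunk toward zero rather than hard-clipped as in ReLU.
    for v in (-2.0, -0.5, 0.0, 0.5, 2.0):
        print(f"x={v:+.1f}  gelu={gelu(v):+.4f}  approx={gelu_tanh(v):+.4f}")
```

Unlike the ReLU, which zeroes every negative input, this curve scales each input by the probability mass Φ(x), so inputs are weighted by how large they are.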