The Softmax activation layer is a popular Deep Neural Network (DNN) component for multi-class prediction problems. In DNN accelerator implementations, however, it adds complexity because an exponential must be computed for each of its inputs. In this brief we propose a simplified version of the activation unit for accelerators, in which a comparator unit alone produces the classification result by selecting the maximum among its inputs. Owing to the monotonicity of the activation function, we show that this result is always identical to the classification produced by the Softmax layer.
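The claim rests on a short monotonicity argument, sketched here in standard notation (the symbols $x_i$ for the layer inputs are ours, not taken from the brief): the normalizing denominator is shared by all outputs, and the exponential is strictly increasing, so Softmax preserves the ordering of its inputs.

```latex
% Softmax preserves the argmax of its inputs: the denominator is the
% same for every output, and t -> e^t is strictly increasing.
\[
  \operatorname{softmax}(x)_i \;=\; \frac{e^{x_i}}{\sum_{j} e^{x_j}},
  \qquad
  \arg\max_i \operatorname{softmax}(x)_i
    \;=\; \arg\max_i e^{x_i}
    \;=\; \arg\max_i x_i .
\]
```

Hence a comparator that returns the index of the largest input yields the same class label as evaluating the full Softmax layer.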