A deep neural network for classification essentially consists of two components: a feature extractor and a function approximator. They usually work as an integrated whole, but an improvement to either component can boost the performance of the whole algorithm. This paper focuses on designing a new function approximator. Conventionally, a function approximator is built around a nonlinear activation function or a nonlinear kernel function, which yields classical networks such as the feed-forward neural network (MLP) and the radial basis function network (RBF). In this paper, an effective and efficient new function approximator is proposed. Instead of designing new activation or kernel functions, the proposed network uses a fractional form; for convenience, we name it the ratio net. We compare the effectiveness and efficiency of the ratio net against those of the RBF and the MLP with various activation functions on two classification tasks: the MNIST database of handwritten digits and the Internet Movie Database (IMDb), a binary sentiment-analysis dataset. The results show that, in most cases, the ratio net converges faster and outperforms both the MLP and the RBF.
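To make the contrast concrete, the following is a minimal sketch of how a fractional-form layer might differ from a conventional activation-based layer. The abstract does not specify the exact fractional form, so the `ratio_layer` below is a hypothetical illustration that assumes the layer computes an elementwise quotient of two affine maps; the function names, the denominator guard `eps`, and the tanh choice for the MLP layer are all illustrative assumptions, not the paper's definitions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_layer(x, W, b):
    # Conventional feed-forward layer: affine map followed by a
    # nonlinear activation (tanh chosen here for illustration).
    return np.tanh(x @ W + b)

def ratio_layer(x, W1, b1, W2, b2, eps=1e-6):
    # Hypothetical ratio-net layer: elementwise quotient of two
    # affine maps, so the nonlinearity comes from the division
    # itself rather than from an activation or kernel function.
    num = x @ W1 + b1
    den = x @ W2 + b2
    # Guard against division by a (near-)zero denominator.
    return num / np.where(np.abs(den) < eps, eps, den)

# A batch of 4 inputs with 8 features, mapped to 5 outputs.
x = rng.standard_normal((4, 8))
W1, W2 = rng.standard_normal((2, 8, 5))
b1, b2 = rng.standard_normal((2, 5))

out = ratio_layer(x, W1, b1, W2, b2)
print(out.shape)  # (4, 5)
```

Note that the ratio layer uses twice the parameters of a single affine map (one set for the numerator, one for the denominator), which is one plausible trade-off behind the reported gains in convergence speed.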