We propose a novel algorithm, the Backpropagation Neural Tree (BNeuralT), which is a stochastic computational dendritic tree. BNeuralT takes random repeated inputs through its leaves and imposes dendritic nonlinearities through its internal connections, as a biological dendritic tree would. Given these biologically plausible dendritic-tree properties, BNeuralT is a single-neuron neural tree model whose internal sub-trees resemble dendritic nonlinearities. The BNeuralT algorithm produces an ad hoc neural tree trained with a stochastic gradient descent optimizer such as gradient descent (GD), momentum GD, Nesterov accelerated GD, Adagrad, RMSprop, or Adam. BNeuralT training has two phases, each computed in a depth-first manner: the forward pass computes the neural tree's output in a post-order traversal, while error backpropagation during the backward pass is performed recursively in a pre-order traversal. A BNeuralT model can be viewed as a minimal subset of a neural network (NN), i.e., a "thinned" NN whose complexity is lower than that of an ordinary NN. Our algorithm produces high-performing and parsimonious models that balance complexity with descriptive ability on a wide variety of machine learning problems: classification, regression, and pattern recognition.
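The two-phase training described above can be illustrated with a minimal sketch of a neural tree node: the forward pass evaluates sub-trees first (post-order), and the backward pass updates a node's weights before recursing into its children (pre-order). All class, field, and parameter names below are assumptions for illustration, not the authors' implementation, and plain GD with a sigmoid nonlinearity stands in for the full set of optimizers.

```python
import math
import random

class Node:
    """One node of a toy neural tree (hypothetical sketch, not BNeuralT itself).
    Leaves read one input feature; internal nodes apply a sigmoid nonlinearity
    to a weighted sum of their sub-trees' outputs."""

    def __init__(self, children=None, leaf_index=None):
        self.children = children or []                # empty list for a leaf
        self.leaf_index = leaf_index                  # feature index fed to a leaf
        self.weights = [random.uniform(-1, 1) for _ in self.children]
        self.out = 0.0                                # cached activation

    def forward(self, x):
        # Post-order traversal: evaluate every sub-tree before this node.
        if not self.children:
            self.out = x[self.leaf_index]
            return self.out
        s = sum(w * c.forward(x) for w, c in zip(self.weights, self.children))
        self.out = 1.0 / (1.0 + math.exp(-s))         # dendritic nonlinearity
        return self.out

    def backward(self, grad_out, lr=0.1):
        # Pre-order traversal: update this node's weights, then recurse.
        if not self.children:
            return                                    # leaves hold no weights
        delta = grad_out * self.out * (1.0 - self.out)  # sigmoid derivative
        for i, c in enumerate(self.children):
            g_child = delta * self.weights[i]         # chain rule, old weight
            self.weights[i] -= lr * delta * c.out     # plain GD step
            c.backward(g_child, lr)

# Tiny usage: a root with two leaves, trained toward a single target.
random.seed(0)
root = Node(children=[Node(leaf_index=0), Node(leaf_index=1)])
x, target = [0.5, -0.2], 1.0
for _ in range(100):
    y = root.forward(x)
    root.backward(y - target)   # gradient of 0.5*(y-target)^2 w.r.t. y
```

The recursion makes the depth-first structure explicit: no layer-wise matrix operations are needed, since each sub-tree carries its own forward value and receives its own backpropagated gradient.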