Recently, significant progress has been made regarding the statistical understanding of artificial neural networks (ANNs). ANNs are motivated by the functioning of the brain, but differ in several crucial aspects. In particular, the local updating rule of the connection parameters in biological neural networks (BNNs) makes it biologically implausible that learning in the brain is based on gradient descent. In this work, we view the brain as a statistical method for supervised learning. The main contribution is to relate the local updating rule of the connection parameters in BNNs to a zero-order optimization method. It is shown that the expected values of the iterates implement a modification of gradient descent.
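The link between zero-order updates and gradient descent in expectation can be illustrated with a minimal sketch. The scheme below is a generic two-point zero-order estimator on a toy quadratic loss, not the paper's actual BNN updating rule; the loss, step size, and perturbation scale are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
target = np.array([1.0, -2.0])

def loss(theta):
    # Toy quadratic loss with known gradient 2 * (theta - target).
    return np.sum((theta - target) ** 2)

def zero_order_update(theta, sigma=1e-3, lr=0.1):
    # Two-point zero-order estimate: probe the loss along a random
    # direction xi; no gradient of `loss` is ever computed.
    xi = rng.standard_normal(theta.shape)
    g_hat = (loss(theta + sigma * xi) - loss(theta - sigma * xi)) / (2 * sigma) * xi
    return theta - lr * g_hat

# Averaging many independent zero-order steps from the same point
# recovers the plain gradient-descent step in expectation.
theta0 = np.zeros(2)
mean_step = np.mean([zero_order_update(theta0) for _ in range(20000)], axis=0)
gd_step = theta0 - 0.1 * 2 * (theta0 - target)
```

Since `E[xi xi^T]` is the identity for standard Gaussian directions, the estimator `g_hat` is (up to O(sigma^2) smoothing bias) an unbiased estimate of the true gradient, which is the sense in which the expected iterates follow gradient descent.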