Despite the recent success of artificial neural networks, more biologically plausible learning methods may be needed to address the weaknesses of backpropagation-trained models, such as catastrophic forgetting and vulnerability to adversarial attacks. Although these weaknesses are not addressed directly, a novel local learning rule is presented that performs online clustering with an upper limit on the number of clusters to be found, rather than a fixed cluster count. Instead of using orthogonal weight or output activation constraints, activation sparsity is achieved through mutual repulsion of lateral Gaussian neurons, which ensures that multiple neuron centers cannot occupy the same location in the input domain. An update method is also presented for adjusting the widths of the Gaussian neurons in cases where the data samples can be represented by means and variances. The algorithms were applied to the MNIST and CIFAR-10 datasets to create filters capturing the input patterns of pixel patches of various sizes. The experimental results demonstrate stability of the learned parameters across a large number of training samples.
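The core mechanism the abstract describes, online clustering with an upper bound on the number of clusters and mutual repulsion between Gaussian neuron centers, could be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual update rule: the class name, learning rates, winner-take-all attraction step, and Gaussian repulsion form are all assumptions introduced here for clarity.

```python
import numpy as np

class GaussianClusterLayer:
    """Illustrative online clustering with at most `max_clusters` Gaussian
    neurons. The winning (closest) center moves toward each sample, while a
    repulsion term pushes nearby centers away from the winner so that
    multiple centers cannot occupy the same location in the input domain.
    All constants and update forms here are assumptions, not from the paper.
    """

    def __init__(self, max_clusters, dim, lr=0.05, repel=0.01, sigma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = rng.normal(size=(max_clusters, dim))
        self.lr, self.repel, self.sigma = lr, repel, sigma

    def step(self, x):
        # Find the winning neuron: the center closest to the sample.
        d = self.centers - x
        win = int((d ** 2).sum(axis=1).argmin())
        # Attract the winner toward the sample (local, online update).
        self.centers[win] -= self.lr * d[win]
        # Mutual repulsion: push the other centers away from the winner,
        # weighted by a Gaussian of their distance to it.
        diff = self.centers - self.centers[win]
        w = np.exp(-(diff ** 2).sum(axis=1) / (2 * self.sigma ** 2))
        w[win] = 0.0  # the winner does not repel itself
        self.centers += self.repel * w[:, None] * diff
        return win
```

Under this sketch, centers that drift close to an occupied location are pushed back out, so fewer than `max_clusters` distinct centers may end up used, which matches the abstract's notion of an upper limit rather than a fixed cluster count.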