Behavior can be described as a temporal sequence of actions driven by neural activity. To learn complex sequential patterns in neural networks, memories of past activity need to persist on significantly longer timescales than the relaxation times of single-neuron activity. While recurrent networks can produce such long transients, training these networks is a challenge. Learning via error propagation confers a significant functional advantage on methods such as FORCE, RTRL, or BPTT, but at the expense of biological plausibility. Reservoir computing circumvents this issue by learning only the readout weights, but it does not scale well with problem complexity. We propose that two prominent structural features of cortical networks can alleviate these issues: the presence of a specific network scaffold at the onset of learning and the existence of dendritic compartments that enhance neuronal information storage and computation. Our resulting model for Efficient Learning of Sequences (ELiSe) builds on these features to acquire and replay complex non-Markovian spatio-temporal patterns using only local, always-on, and phase-free synaptic plasticity. We showcase the capabilities of ELiSe in a mock-up of birdsong learning, and demonstrate its flexibility with respect to parametrization as well as its robustness to external disturbances.
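To make the reservoir-computing baseline mentioned above concrete, the following is a minimal echo state network sketch in which the random recurrent weights stay fixed and only the linear readout is trained (here via ridge regression). All names and parameter values (reservoir size `N`, spectral radius 0.9, ridge parameter `lam`) are illustrative assumptions, not details of ELiSe.

```python
# Minimal echo state network (reservoir computing) sketch in NumPy.
# The recurrent "reservoir" weights are fixed and random; only the
# linear readout is trained. All values here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 1000                      # reservoir size, sequence length

# Fixed random recurrent weights, rescaled to spectral radius 0.9
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.normal(0, 1.0, N)          # fixed random input weights

u = np.sin(0.1 * np.arange(T))        # toy input: a sine wave
y_target = np.roll(u, -1)             # task: predict the next input value

# Run the reservoir and collect states; no learning happens in this loop
x = np.zeros(N)
X = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    X[t] = x

# Train only the readout weights with ridge regression
lam = 1e-4
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y_target)
print("readout MSE:", np.mean((X @ w_out - y_target) ** 2))
```

Because the reservoir itself is never trained, the network's internal dynamics must happen to contain features useful for the task, which is one way to read the scaling limitation noted above.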