Brains have evolved a diverse set of neurons with varying morphologies, physiological properties, and rich dynamics that shape how they process temporal information. By contrast, most neural network models use a homogeneous set of units that vary only in their spatial parameters (weights and biases). To investigate the importance of temporal parameters for neural function, we trained spiking neural networks on tasks of varying temporal complexity, with different subsets of parameters held constant. We find that in a tightly resource-constrained setting, adapting conduction delays is essential to solve all test conditions, and indeed that these tasks can be solved using only temporal parameters (delays and time constants) with weights held constant. In the most complex spatio-temporal task we studied, an adaptable bursting parameter was essential. More generally, allowing adaptation of both temporal and spatial parameters increases network robustness to noise, an important feature for both biological brains and neuromorphic computing systems. In summary, our findings highlight how rich and adaptable dynamics are key to solving temporally structured tasks at a low neural resource cost, which may be part of the reason why biological neurons vary so dramatically in their physiological properties.
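To make the setup concrete, the sketch below illustrates one way a spiking layer with trainable temporal parameters and frozen spatial parameters could be written. This is a minimal, hypothetical example, not the authors' implementation: the class name, the soft (softmax-relaxed) delay parameterisation, and constants such as `max_delay` and the unit firing threshold are assumptions for illustration, and a practical version would also need a surrogate gradient for the hard spike threshold.

```python
import torch
import torch.nn as nn

class LIFLayerWithTemporalParams(nn.Module):
    """Hypothetical leaky integrate-and-fire layer: membrane time constants
    and conduction delays are trainable, synaptic weights are frozen."""

    def __init__(self, n_in: int, n_out: int, dt: float = 1.0, max_delay: int = 10):
        super().__init__()
        # Frozen spatial parameters: weights fixed at initialisation.
        self.weight = nn.Parameter(0.1 * torch.randn(n_out, n_in),
                                   requires_grad=False)
        # Trainable temporal parameters: per-neuron membrane time constant
        # (log-parameterised to stay positive) and per-synapse delays,
        # relaxed to a soft mixture over discrete time steps so they can
        # be optimised by gradient descent (an illustrative choice).
        self.log_tau = nn.Parameter(torch.full((n_out,), float(torch.log(torch.tensor(5.0)))))
        self.delay_logits = nn.Parameter(torch.zeros(n_out, n_in, max_delay))
        self.dt = dt
        self.max_delay = max_delay

    def forward(self, spikes: torch.Tensor) -> torch.Tensor:
        # spikes: (time, batch, n_in) binary spike trains.
        T, B, _ = spikes.shape
        tau = torch.exp(self.log_tau)                        # (n_out,)
        decay = torch.exp(-self.dt / tau)                    # membrane decay per step
        delay_probs = torch.softmax(self.delay_logits, -1)   # soft delay distribution
        v = torch.zeros(B, self.weight.shape[0])
        out = []
        for t in range(T):
            # Delayed presynaptic drive: weight each past time step by the
            # learned delay distribution of every synapse.
            drive = torch.zeros(B, self.weight.shape[0])
            for d in range(min(self.max_delay, t + 1)):
                w_d = self.weight * delay_probs[:, :, d]     # (n_out, n_in)
                drive = drive + spikes[t - d] @ w_d.t()
            v = decay * v + drive
            s = (v > 1.0).float()                            # hard threshold (needs a
            v = v * (1.0 - s)                                # surrogate gradient in practice)
            out.append(s)
        return torch.stack(out)                              # (time, batch, n_out)
```

Under this kind of parameterisation, freezing `weight` while leaving `log_tau` and `delay_logits` trainable corresponds to the paper's "temporal parameters only" condition, whereas freezing the temporal parameters and training the weights recovers the conventional setup.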