High-efficiency hardware integration of neural networks benefits from realizing nonlinearity, network connectivity, and learning fully in a physical substrate. Multiple systems have recently implemented some or all of these operations, yet the focus has so far been on addressing technological challenges, and fundamental questions regarding learning in hardware neural networks remain largely unexplored. Noise in particular is unavoidable in such architectures, and here we investigate its interaction with a learning algorithm using an opto-electronic recurrent neural network. We find that noise strongly modifies the system's path during convergence and, surprisingly, fully decorrelates the final readout weight matrices. This highlights the importance of understanding architecture, noise, and learning algorithm as interacting players, and therefore identifies the need for mathematical tools for optimizing noisy, analogue systems.
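The decorrelation of readout weights across noisy training runs can be illustrated with a toy sketch. This is my own construction, not the paper's opto-electronic setup: it trains an underdetermined linear readout twice with additive update noise (all sizes, the learning rate, and the noise model are illustrative assumptions) and then measures the Pearson correlation between the two final weight vectors, the kind of quantity the abstract's decorrelation claim refers to.

```python
# Toy sketch (assumed setup, not the paper's experiment): two gradient-descent
# runs on the same underdetermined linear readout, perturbed by independent
# additive noise, compared via Pearson correlation of their final weights.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_weights = 10, 50          # underdetermined: many readouts fit the data
X = rng.standard_normal((n_samples, n_weights))
y = rng.standard_normal(n_samples)

def train(seed, steps=2000, lr=0.01, noise_std=0.05):
    """Gradient descent on squared error with additive noise on each update."""
    noise_rng = np.random.default_rng(seed)
    w = np.zeros(n_weights)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n_samples   # gradient acts only in the row space
        w -= lr * grad + noise_std * noise_rng.standard_normal(n_weights)
    return w

w_a, w_b = train(seed=1), train(seed=2)
corr = np.corrcoef(w_a, w_b)[0, 1]             # correlation of flattened weights
print(f"correlation between noisy runs: {corr:.3f}")
```

Because the gradient only acts in the row space of the data, noise injected into the null space accumulates as an independent random walk in each run, so the two weight vectors drift apart even though both still fit the data; with the noise switched off the runs would be identical.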