Neural networks are typically represented as data structures that are traversed either through iteration or by manually chaining method calls. A deeper analysis, however, reveals that structured recursion can be used instead, so that traversal is directed by the structure of the network itself. This paper shows how such an approach can be realised in Haskell: neural networks are encoded as recursive data types, and their training as recursion scheme patterns. In turn, we promote a coherent implementation of neural networks that delineates between their structure and semantics, allowing for compositionality in both how they are built and how they are trained.
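To make the idea concrete, the following is a minimal sketch of what such an encoding might look like. It is an illustrative assumption, not the paper's actual implementation: the names `NetworkF`, `Network`, and `forward` are hypothetical, and forward propagation is expressed as a catamorphism (a fold directed by the network's structure) over a recursively defined chain of layers.

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Hypothetical sketch: a network as a recursive data type. Each layer
-- carries a weight matrix and a bias vector, and points to the rest of
-- the network; OutputF marks the end of the chain.
data NetworkF a r
  = OutputF
  | LayerF [[a]] [a] r
  deriving Functor

-- Fixed point of a functor, tying the recursive knot.
newtype Fix f = Fix (f (Fix f))

type Network a = Fix (NetworkF a)

-- A catamorphism: structured recursion driven by the data type itself,
-- rather than by explicit iteration over the layers.
cata :: Functor f => (f b -> b) -> Fix f -> b
cata alg (Fix x) = alg (fmap (cata alg) x)

-- Forward propagation as an algebra: each layer transforms its input
-- and hands the result to the (already folded) remainder of the network.
forward :: Num a => Network a -> [a] -> [a]
forward = cata alg
  where
    alg OutputF             = id
    alg (LayerF ws bs rest) = rest . layer ws bs
    -- Affine map: one output per weight row, plus its bias
    -- (activation functions omitted for brevity).
    layer ws bs xs = zipWith (\row b -> b + sum (zipWith (*) row xs)) ws bs
```

Under this sketch, traversal order never appears explicitly: `cata` recurses wherever the `Network` structure recurses, which is the separation between structure and semantics that the abstract describes.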