Neural networks adapt very well to distributed and continuous representations, but struggle to generalize from small amounts of data. Symbolic systems commonly achieve data-efficient generalization by exploiting modularity to benefit from local and discrete features of a representation. These features allow symbolic programs to be improved one module at a time and to experience combinatorial growth in the values they can successfully process. However, it is difficult to design a component that can be used to form symbolic abstractions and which is adequately overparametrized to learn arbitrary high-dimensional transformations. I present Graph-based Symbolically Synthesized Neural Networks (G-SSNNs), a class of neural modules that operate on representations modified with synthesized symbolic programs to include a fixed set of local and discrete features. I demonstrate that the choice of injected features within a G-SSNN module modulates the data efficiency and generalization of baseline neural models, creating predictable patterns of both heightened and curtailed generalization. By training G-SSNNs, we also derive information about desirable semantics of symbolic programs without manual engineering. This information is compact and amenable to abstraction, but can also be flexibly recontextualized for other high-dimensional settings. In future work, I will investigate data-efficient generalization and the transferability of learned symbolic representations in more complex G-SSNN designs based on more complex classes of symbolic programs. Experimental code and data are available at https://github.com/shlomenu/symbolically_synthesized_networks .
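To make the core idea concrete, the following is a minimal toy sketch (not the paper's implementation; all names and the trivial "symbolic program" are illustrative assumptions) of a module in this style: a fixed symbolic program injects local, discrete features into a continuous representation before a learned transformation consumes the augmented input.

```python
# Hypothetical sketch of a G-SSNN-style module. The "symbolic program"
# here is a stand-in: it partitions the input into local chunks and
# emits one discrete feature (the sign of each chunk's mean).

def symbolic_program(x, n_parts=4):
    """Fixed symbolic program: map a continuous vector to a small set
    of local, discrete features (one binary token per chunk)."""
    size = max(1, len(x) // n_parts)
    chunks = [x[i:i + size] for i in range(0, len(x), size)]
    return [1 if sum(c) / len(c) >= 0 else 0 for c in chunks]

def gssnn_forward(x, weights):
    """Neural half of the module: a single linear map applied to the
    original representation concatenated with the injected features."""
    feats = symbolic_program(x)
    augmented = list(x) + [float(f) for f in feats]
    return sum(w * v for w, v in zip(weights, augmented))
```

In the actual work, the injected features are produced by synthesized graph-building programs and the neural component is overparametrized; the sketch only illustrates the separation between a fixed symbolic feature-injection step and a learned transformation over the augmented representation.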