In this study, we build on a previously proposed neuroevolution framework to evolve deep convolutional models. Specifically, the genome encoding and the crossover operator are extended to make them applicable to layered networks. We also propose a convolutional layer layout that allows kernels of different shapes and sizes to coexist within the same layer, and argue why this may be beneficial. This layout enables the size and shape of individual kernels within a convolutional layer to be evolved with a corresponding new mutation operator. The framework employs a hybrid optimisation strategy that combines structural changes through epigenetic evolution with weight updates through backpropagation in a population-based setting. Experiments on several image classification benchmarks demonstrate that the crossover operator is sufficiently robust to produce increasingly performant offspring even when the parents are trained on only a small random subset of the training dataset in each epoch. This provides direct confirmation that learned features and behaviour can be successfully transferred from parent networks to offspring in the next generation.
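The mixed-kernel layout and shape mutation described above can be illustrated with a minimal NumPy sketch. All names here (`MixedKernelConvLayer`, `mutate_kernel_shape`) are hypothetical, and the centre-crop weight transfer during mutation is an assumption for illustration, not the paper's actual operator:

```python
import numpy as np

def conv2d_same(x, kernel):
    """Single-channel 2D cross-correlation with zero 'same' padding
    (assumes odd kernel dimensions). x: (H, W), kernel: (kh, kw) -> (H, W)."""
    kh, kw = kernel.shape
    xp = np.pad(x, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    H, W = x.shape
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * kernel)
    return out

class MixedKernelConvLayer:
    """Hypothetical layer in which each output channel owns a kernel of
    its own shape, so e.g. 3x3, 5x5 and 1x7 kernels coexist in one layer."""
    def __init__(self, kernel_shapes, rng):
        self.kernels = [rng.standard_normal(s) for s in kernel_shapes]

    def forward(self, x):
        # One feature map per kernel, all the same spatial size.
        return np.stack([conv2d_same(x, k) for k in self.kernels])

    def mutate_kernel_shape(self, idx, new_shape, rng):
        """Sketch of a shape-mutation operator: resize one kernel,
        keeping the overlapping centre weights (an assumed heuristic)."""
        old = self.kernels[idx]
        new = rng.standard_normal(new_shape) * 0.01
        ch = min(old.shape[0], new_shape[0])
        cw = min(old.shape[1], new_shape[1])

        def centre(arr, h, w):
            oh = (arr.shape[0] - h) // 2
            ow = (arr.shape[1] - w) // 2
            return slice(oh, oh + h), slice(ow, ow + w)

        # Copy the centre crop of the old weights into the new kernel.
        new[centre(new, ch, cw)] = old[centre(old, ch, cw)]
        self.kernels[idx] = new
```

Because every branch uses 'same' padding, the heterogeneous kernels still produce feature maps of identical spatial size, which is what lets them coexist in one layer; the mutation changes a kernel's shape without disturbing the layer's output dimensions.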