We propose a simple scheme for merging two neural networks, trained from different random initializations, into a single network of the same size as the originals. We do this by carefully selecting channels from each input network. Our procedure can serve as a finalization step after training from multiple starting seeds to avoid an unlucky one. We also show that training two networks and merging them yields better performance than training a single network for an extended period. Availability: https://github.com/fmfi-compbio/neural-network-merging
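A minimal sketch of the channel-selection idea for a single layer, under stated assumptions: the abstract does not specify the selection criterion, so ranking channels by weight norm and the 50/50 split between networks here are hypothetical illustrations, not the paper's actual procedure.

```python
import numpy as np

def merge_channels(w_a, w_b, keep_a):
    """Merge two weight matrices of shape (out_channels, in_features)
    by keeping `keep_a` channels from network A and the rest from
    network B. Channels are ranked by L2 norm (a hypothetical
    criterion for illustration only)."""
    n_out = w_a.shape[0]
    keep_b = n_out - keep_a
    # Rank each network's output channels by their weight norm.
    idx_a = np.argsort(-np.linalg.norm(w_a, axis=1))[:keep_a]
    idx_b = np.argsort(-np.linalg.norm(w_b, axis=1))[:keep_b]
    # The merged layer has the same size as each original layer.
    return np.concatenate([w_a[idx_a], w_b[idx_b]], axis=0)

# Two single-layer "networks" trained from different seeds (toy data).
rng = np.random.default_rng(0)
w_a = rng.normal(size=(8, 4))
w_b = rng.normal(size=(8, 4))
merged = merge_channels(w_a, w_b, keep_a=4)
print(merged.shape)  # same size as each input layer: (8, 4)
```

In a real multi-layer network, selecting output channels in one layer would also require selecting the matching input channels of the next layer so the composed network stays consistent.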