We propose a general method for growing neural networks with shared parameters by matching trained subnetworks to new inputs. By leveraging Hoeffding's inequality, we provide a theoretical basis for improving performance by adding subnetworks to an existing network. Building on this basis, we implement a matching method that applies trained subnetworks of the existing network to new inputs. Our method improves performance with higher parameter efficiency. It can also be applied in the cross-task setting, realizing transfer learning by recombining subnetworks without training on the new task.
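For reference, the concentration bound underlying the stated theoretical basis is Hoeffding's inequality in its standard form (how the abstract's analysis applies it to subnetwork addition is not detailed here): for independent random variables $X_1,\dots,X_n$ with $X_i \in [a_i, b_i]$ almost surely and empirical mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$,

```latex
P\left(\left|\bar{X} - \mathbb{E}[\bar{X}]\right| \geq t\right)
\;\leq\; 2\exp\left(-\frac{2n^2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2}\right),
\qquad t > 0.
```

Intuitively, the deviation of an averaged quantity from its expectation decays exponentially in the number of independent components, which is the kind of guarantee one can invoke when aggregating additional subnetworks.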