We present MorphNet, an approach to automate the design of neural network structures. MorphNet iteratively shrinks and expands a network, shrinking via a resource-weighted sparsifying regularizer on activations and expanding via a uniform multiplicative factor on all layers. In contrast to previous approaches, our method is scalable to large networks, adaptable to specific resource constraints (e.g. the number of floating-point operations per inference), and capable of increasing the network's performance. When applied to standard network architectures on a wide variety of datasets, our approach discovers novel structures in each domain, obtaining higher performance while respecting the resource constraint.
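The abstract describes an alternating shrink/expand loop: shrink by training with a resource-weighted sparsifying regularizer and removing near-zero channels, then expand all layer widths by a uniform multiplier until the resource budget is reached. The sketch below is a minimal, framework-free illustration of that outer loop; the toy FLOP cost model, the simulated `train_with_regularizer` step, the pruning threshold, and the expansion step size are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of a MorphNet-style shrink/expand loop.
# Assumptions (not from the paper): a toy FLOP cost model, a simulated
# regularized-training step, a fixed pruning threshold, and a 5% step
# when searching for the uniform width multiplier.

from typing import List
import random


def flops_cost(widths: List[int]) -> float:
    """Toy FLOP model: layer i costs widths[i-1] * widths[i]."""
    return float(sum(widths[i - 1] * widths[i] for i in range(1, len(widths))))


def train_with_regularizer(widths: List[int], reg_strength: float) -> List[List[float]]:
    """Stand-in for training with a resource-weighted sparsifying regularizer.
    Returns per-channel importance scores; stronger regularization pushes
    more scores to zero (simulated here with random draws)."""
    return [[max(0.0, random.gauss(1.0, 1.0) - reg_strength) for _ in range(w)]
            for w in widths]


def shrink(widths: List[int], reg_strength: float, threshold: float = 0.1) -> List[int]:
    """Shrink step: keep only channels whose importance exceeds the threshold."""
    scores = train_with_regularizer(widths, reg_strength)
    return [max(1, sum(s > threshold for s in layer)) for layer in scores]


def expand(widths: List[int], budget: float) -> List[int]:
    """Expand step: apply the largest uniform multiplier that keeps the
    toy FLOP cost within the budget."""
    omega = 1.0
    while flops_cost([max(1, round(w * omega * 1.05)) for w in widths]) <= budget:
        omega *= 1.05
    return [max(1, round(w * omega)) for w in widths]


def morphnet_loop(widths: List[int], budget: float, iterations: int = 3) -> List[int]:
    """Alternate shrinking and expanding for a fixed number of iterations."""
    for _ in range(iterations):
        widths = shrink(widths, reg_strength=0.5)
        widths = expand(widths, budget)
    return widths


if __name__ == "__main__":
    random.seed(0)
    seed_widths = [64, 128, 256, 256]          # widths of a seed architecture
    budget = flops_cost(seed_widths)           # target: same FLOPs as the seed
    print(morphnet_loop(seed_widths, budget))  # a re-allocated set of widths
```

In this sketch the resource budget stays fixed while channels are re-allocated across layers, which is the effect the abstract claims: higher performance at the same resource cost.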