The fast execution speed and energy efficiency of analog hardware have made it a strong contender for deploying deep learning models at the edge. However, analog hardware introduces noise that perturbs the weights of a model, degrading its performance despite the inherent noise-resistant characteristics of deep learning models. This work investigates the effect of the popular batch normalization layer on the noise resistance of deep learning models. The systematic study is carried out by first training different models, with and without batch normalization layers, on the CIFAR10 and CIFAR100 datasets. Analog noise is then injected into the weights of the resulting models, and their performance on the test dataset is measured and compared. The results show that the presence of batch normalization layers negatively impacts the noise-resistant property of deep learning models, and that the impact grows with the number of batch normalization layers.
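The weight-perturbation step described above can be sketched as follows. This is a minimal illustration, assuming the analog noise is modeled as zero-mean Gaussian noise whose standard deviation is a fraction (`noise_power`) of each weight tensor's own standard deviation; the exact noise model used in the study is not specified in this abstract, so the function name and parameters here are illustrative assumptions.

```python
import numpy as np

def inject_analog_noise(weights, noise_power=0.1, rng=None):
    """Return a noisy copy of each weight tensor.

    Assumed noise model (illustrative): zero-mean Gaussian noise with
    per-tensor standard deviation equal to noise_power * std(weights).
    """
    rng = np.random.default_rng(0) if rng is None else rng
    noisy = []
    for w in weights:
        sigma = noise_power * np.std(w)  # scale noise to each tensor
        noisy.append(w + rng.normal(0.0, sigma, size=w.shape))
    return noisy

# Example: perturb the weights of two hypothetical layers, then the
# perturbed weights would be loaded back into the model for evaluation.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((4, 4)), rng.standard_normal((8,))]
noisy_layers = inject_analog_noise(layers, noise_power=0.1, rng=rng)
```

In an actual experiment, the perturbed tensors would replace the trained weights (e.g. via a framework's state-dict loading), and test accuracy would be recorded at several `noise_power` levels to compare models with and without batch normalization.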