Learning in uncertain, noisy, or adversarial environments is a challenging task for deep neural networks (DNNs). We propose a new, theoretically grounded and efficient approach to robust learning that builds upon Bayesian estimation and Variational Inference. We formulate the problem of density propagation through the layers of a DNN and solve it using an Ensemble Density Propagation (EnDP) scheme. The EnDP approach allows us to propagate the moments of the variational probability distribution across the layers of a Bayesian DNN, enabling the estimation of the mean and covariance of the predictive distribution at the output of the model. Our experiments on the MNIST and CIFAR-10 datasets show a significant improvement in the robustness of the trained models to random noise and adversarial attacks.
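To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of ensemble-based propagation of a variational distribution through a Bayesian network: weight posteriors are assumed to be factorized Gaussians, an ensemble of weight samples is pushed through the layers, and the predictive mean and covariance are estimated empirically at the output. The layer sizes, ensemble size `S`, and helper names such as `endp_forward` are illustrative assumptions, not part of the paper.

```python
# Sketch of ensemble-based moment propagation through a small Bayesian MLP.
# Assumes factorized Gaussian variational posteriors q(w) = N(mu, sigma^2).
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def sample_layer(mu_W, rho_W, mu_b, rho_b):
    """Draw one weight/bias sample from the variational Gaussian posterior."""
    sigma_W = np.log1p(np.exp(rho_W))   # softplus keeps the std. dev. positive
    sigma_b = np.log1p(np.exp(rho_b))
    W = mu_W + sigma_W * rng.standard_normal(mu_W.shape)
    b = mu_b + sigma_b * rng.standard_normal(mu_b.shape)
    return W, b

def endp_forward(x, layers, S=100):
    """Propagate an ensemble of S weight samples; return predictive mean/covariance."""
    outputs = []
    for _ in range(S):
        h = x
        for i, (mu_W, rho_W, mu_b, rho_b) in enumerate(layers):
            W, b = sample_layer(mu_W, rho_W, mu_b, rho_b)
            h = h @ W + b
            if i < len(layers) - 1:      # nonlinearity on hidden layers only
                h = relu(h)
        outputs.append(h)
    outputs = np.stack(outputs)          # shape (S, output_dim)
    mean = outputs.mean(axis=0)
    cov = np.cov(outputs, rowvar=False)  # empirical covariance of the ensemble
    return mean, cov

# Illustrative 2-layer network: 784 -> 64 -> 10 (MNIST-sized input).
def make_layer(n_in, n_out):
    return (0.1 * rng.standard_normal((n_in, n_out)),  # mu_W
            -5.0 * np.ones((n_in, n_out)),             # rho_W (small initial std.)
            np.zeros(n_out),                           # mu_b
            -5.0 * np.ones(n_out))                     # rho_b

layers = [make_layer(784, 64), make_layer(64, 10)]
x = rng.standard_normal(784)
mean, cov = endp_forward(x, layers, S=200)
print(mean.shape, cov.shape)             # (10,) (10, 10)
```

The output covariance gives a per-input uncertainty estimate, which is the quantity the abstract associates with improved robustness to noise and adversarial perturbations.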