Many efforts have been made to reveal the decision-making process of black-box learning machines such as deep neural networks, resulting in useful local and global explanation methods. For local explanation, stochasticity is known to help: a simple method called SmoothGrad has improved the visual quality of gradient-based attribution by adding noise in the input space and averaging over the noise. In this paper, we extend this idea and propose NoiseGrad, which enhances both local and global explanation methods. Specifically, NoiseGrad introduces stochasticity in the weight parameter space, such that the decision boundary is perturbed. Owing to the dual relationship between input perturbation and decision boundary perturbation, NoiseGrad is expected to enhance local explanations in a way similar to SmoothGrad. Furthermore, NoiseGrad can be used to enhance global explanations. We evaluate NoiseGrad and its fusion with SmoothGrad, which we call FusionGrad, both qualitatively and quantitatively under several evaluation criteria, and show that our novel approach significantly outperforms the baseline methods. Both NoiseGrad and FusionGrad are method-agnostic and as handy as SmoothGrad: simple heuristics suffice for choosing the hyperparameters, with no need for fine-tuning.
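To make the description above concrete, here is a minimal sketch, not the authors' reference implementation, of the three ideas in PyTorch. The classifier `model`, the input `x`, the target class `target`, the multiplicative form of the weight noise, and the noise scales `sigma_x` and `sigma_w` with sample counts `n` and `m` are all illustrative assumptions.

```python
# Sketch of SmoothGrad (input-space noise), NoiseGrad (weight-space noise),
# and FusionGrad (both combined). Assumes `model` maps a batch of inputs to
# class logits of shape [batch, classes].
import copy
import torch

def saliency(model, x, target):
    """Plain gradient attribution: d logit[target] / d x."""
    x = x.detach().clone().requires_grad_(True)
    model(x)[:, target].sum().backward()
    return x.grad

def smoothgrad(model, x, target, sigma_x=0.1, n=25):
    """Average gradients over Gaussian noise added to the input."""
    grads = [saliency(model, x + sigma_x * torch.randn_like(x), target)
             for _ in range(n)]
    return torch.stack(grads).mean(0)

def noisegrad(model, x, target, sigma_w=0.1, m=25, base=saliency):
    """Average attributions over (here assumed multiplicative) Gaussian
    noise on the weights, i.e. over perturbed decision boundaries."""
    grads = []
    for _ in range(m):
        noisy = copy.deepcopy(model)
        with torch.no_grad():
            for p in noisy.parameters():
                p.mul_(1 + sigma_w * torch.randn_like(p))
        grads.append(base(noisy, x, target))
    return torch.stack(grads).mean(0)

def fusiongrad(model, x, target, sigma_x=0.1, sigma_w=0.1, n=5, m=5):
    """Weight noise (NoiseGrad) wrapped around input noise (SmoothGrad)."""
    return noisegrad(model, x, target, sigma_w, m,
                     base=lambda mdl, xx, t: smoothgrad(mdl, xx, t, sigma_x, n))
```

Because `noisegrad` takes the base attribution method as an argument, the same wrapper applies unchanged to any local explanation method, which is the method-agnostic property claimed above.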