Many efforts have been made to reveal the decision-making process of black-box learning machines such as deep neural networks, resulting in useful local and global explanation methods. For local explanation, stochasticity is known to help: a simple method, called SmoothGrad, has improved the visual quality of gradient-based attribution by adding noise to the input space and averaging the explanations of the noisy inputs. In this paper, we extend this idea and propose NoiseGrad, which enhances both local and global explanation methods. Specifically, NoiseGrad introduces stochasticity in the weight parameter space, such that the decision boundary is perturbed. NoiseGrad is expected to enhance the local explanation, similarly to SmoothGrad, due to the dual relationship between the input perturbation and the decision boundary perturbation. We evaluate NoiseGrad and its fusion with SmoothGrad -- FusionGrad -- qualitatively and quantitatively with several evaluation criteria, and show that our novel approach significantly outperforms the baseline methods. Both NoiseGrad and FusionGrad are method-agnostic and as handy as SmoothGrad: a simple heuristic for the choice of the hyperparameters works well without the need for fine-tuning.
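The core idea can be illustrated with a minimal sketch: instead of adding noise to the input (as SmoothGrad does), NoiseGrad draws noisy copies of the model's weights and averages the gradient attributions computed under each copy. The toy linear model, the multiplicative Gaussian noise, and all names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear model f(x) = w . x; its gradient attribution w.r.t. x is w itself.
w = np.array([1.0, -2.0, 0.5])
x = np.array([0.3, 0.8, -0.1])


def attribution(weights, x):
    # Gradient of f(x) = weights . x with respect to the input x.
    return weights


def noisegrad(weights, x, n_samples=500, sigma=0.2):
    # NoiseGrad sketch (assumed setup): perturb the *weights* with
    # multiplicative Gaussian noise and average the resulting attributions.
    explanations = []
    for _ in range(n_samples):
        noisy_w = weights * rng.normal(1.0, sigma, size=weights.shape)
        explanations.append(attribution(noisy_w, x))
    return np.mean(explanations, axis=0)


expl = noisegrad(w, x)
```

For this linear toy model the averaged attribution converges back to `w` as the number of samples grows; for a nonlinear network, averaging over perturbed decision boundaries is what smooths the explanation.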