The explainability of Graph Neural Networks (GNNs) is critical to many GNN applications but remains an open challenge. A convincing explanation should be both necessary and sufficient. However, existing GNN explanation approaches focus on only one of the two aspects, necessity or sufficiency, or on a heuristic trade-off between them. Theoretically, the Probability of Necessity and Sufficiency (PNS) can be used to search for the most necessary and sufficient explanation, since it mathematically quantifies both the necessity and the sufficiency of an explanation. Nevertheless, the difficulty of obtaining PNS, caused by non-monotonicity and the challenge of counterfactual estimation, limits its wide use. To address the non-identifiability of PNS, we resort to a lower bound of PNS that can be optimized via counterfactual estimation, and propose Necessary and Sufficient Explanation for GNN (NSEG), which optimizes this lower bound. Specifically, we employ nearest-neighbor matching to generate counterfactual samples, and leverage continuous masks with a sampling strategy to optimize the lower bound. An empirical study shows that NSEG achieves excellent performance in generating the most necessary and sufficient explanations compared with a series of state-of-the-art methods.
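For context, the kind of lower bound the abstract alludes to is the classical one from the causal-inference literature (Tian and Pearl's bounds on PNS); the notation here is illustrative, with $E$ standing for a candidate explanation subgraph and $y$ for the model's prediction, not the paper's own symbols:

```latex
% Classical bounds on the Probability of Necessity and Sufficiency (PNS).
% P(y_e): probability the prediction is y when explanation E is present (do(E = e));
% P(y_{e'}): the same probability under counterfactual removal of E (do(E = e')).
\max\bigl(0,\; P(y_e) - P(y_{e'})\bigr)
  \;\le\; \mathrm{PNS} \;\le\;
\min\bigl(P(y_e),\; P(y'_{e'})\bigr)
```

Maximizing the lower bound $P(y_e) - P(y_{e'})$ simultaneously encourages sufficiency (the explanation alone yields the prediction, large $P(y_e)$) and necessity (removing it changes the prediction, small $P(y_{e'})$), which is why optimizing this bound can target both criteria at once.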