Neural networks are a powerful class of non-linear functions. However, their black-box nature makes it difficult to explain their behaviour and certify their safety. Abstraction techniques address this challenge by transforming the neural network into a simpler, over-approximated function. Unfortunately, existing abstraction techniques are slack, which limits their applicability to small local regions of the input domain. In this paper, we propose Global Interval Neural Network Abstractions with Center-Exact Reconstruction (GINNACER). Our novel abstraction technique produces sound over-approximation bounds over the whole input domain while guaranteeing exact reconstructions for any given local input. Our experiments show that GINNACER is several orders of magnitude tighter than state-of-the-art global abstraction techniques, while being competitive with local ones.