Deep neural networks (DNNs) have demonstrated great success in classification tasks. However, they act as black boxes, offering little insight into how a particular classification decision is made. To this end, we propose to distill the knowledge from a DNN into a fuzzy inference system (FIS), which in this paper is of Takagi-Sugeno-Kang (TSK) type. The resulting model expresses the knowledge acquired by the DNN as fuzzy rules, making a particular decision much easier to explain. Knowledge distillation (KD) is applied to create a TSK-type FIS that generalizes better than one built directly from the training data, which is verified by the experiments in this paper. To further improve performance, we modify the baseline KD method and obtain improved results.
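As context for the distillation objective referenced above, the sketch below shows the standard KD loss (Hinton-style soft targets): a weighted sum of the cross-entropy with the teacher's temperature-softened outputs and the usual hard-label cross-entropy. This is a minimal NumPy illustration only; it assumes generic logits for the teacher DNN and the student, and it does not reproduce the TSK-FIS student model or the modified KD variant proposed in the paper. The temperature `T` and weight `alpha` are illustrative hyperparameters.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; larger T produces softer distributions.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Baseline knowledge-distillation loss (illustrative sketch):
    alpha * soft-target cross-entropy + (1 - alpha) * hard-label cross-entropy."""
    p_teacher = softmax(teacher_logits, T)    # soft targets from the DNN teacher
    p_student_T = softmax(student_logits, T)  # student's temperature-softened predictions
    p_student = softmax(student_logits, 1.0)  # student's ordinary predictions

    n = student_logits.shape[0]
    # Soft-target term, scaled by T^2 to keep gradient magnitudes comparable.
    soft = -np.mean(np.sum(p_teacher * np.log(p_student_T + 1e-12), axis=1)) * T * T
    # Hard-label cross-entropy term against the ground-truth classes.
    hard = -np.mean(np.log(p_student[np.arange(n), labels] + 1e-12))
    return alpha * soft + (1.0 - alpha) * hard
```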