Current research on Explainable AI (XAI) heavily targets expert users such as data scientists and AI developers. However, there have been growing calls to make AI more understandable to non-experts, who are expected to leverage AI techniques but have limited knowledge about AI. We present a mobile application that supports non-experts in interactively making sense of Convolutional Neural Networks (CNNs); it allows users to play with a pretrained CNN by taking pictures of objects in their surroundings. We use a state-of-the-art XAI technique, Class Activation Mapping (CAM), to intuitively visualize the model's decision by highlighting the image regions that contribute most to a given prediction. Deployed in a university course, this playful learning tool was found to help design students gain vivid understandings of the capabilities and limitations of pretrained CNNs in real-world environments. We report concrete examples of students' playful explorations to characterize their sensemaking processes, which reflect different depths of thought.
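To make the explanation technique concrete, below is a minimal NumPy sketch of how a Class Activation Map is computed for a CNN with global average pooling followed by a linear classifier. The function name, array shapes, and inputs are illustrative assumptions, not the paper's actual implementation: the map is simply the final conv-layer feature maps weighted by the classifier weights of the class being explained.

```python
import numpy as np

def class_activation_map(feature_maps, fc_weights, class_idx):
    """Compute a Class Activation Map (illustrative sketch).

    feature_maps: (C, H, W) activations from the last conv layer
    fc_weights:   (num_classes, C) weights of the final linear layer
    class_idx:    index of the class to explain

    Returns an (H, W) map normalized to [0, 1]; larger values mark
    image regions that contribute more to the chosen class score.
    """
    w = fc_weights[class_idx]                    # (C,) weights for this class
    cam = np.tensordot(w, feature_maps, axes=1)  # weighted sum over channels -> (H, W)
    cam -= cam.min()                             # shift so the minimum is 0
    if cam.max() > 0:
        cam /= cam.max()                         # scale to [0, 1]
    return cam
```

In a mobile app such as the one described, the resulting map would typically be upsampled to the input-image size and overlaid as a heatmap on the user's photo.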