Artificial intelligence (AI) is becoming increasingly complex, making it difficult for users to understand how an AI system derives its predictions. Using explainable AI (XAI) methods, researchers aim to explain AI decisions to users. So far, XAI-based explanations have pursued a technology-focused approach, neglecting how users' cognitive abilities and differences in information processing shape their understanding of explanations. Hence, this study takes a human-centered perspective and incorporates insights from cognitive psychology. In particular, we draw on the psychological construct of cognitive styles, which describes humans' characteristic modes of processing information. Applying a between-subjects experimental design, we investigate how users' rational and intuitive cognitive styles affect their objective and subjective understanding of different types of explanations provided by an AI. Initial results indicate substantial differences in users' understanding depending on their cognitive style. We expect to contribute to a more nuanced view of the interrelation of human factors and XAI design.