As the public seeks greater accountability and transparency from machine learning algorithms, the research literature on methods to explain algorithms and their outputs has rapidly expanded. Feature importance methods form a popular class of explanation methods. In this paper, we apply the lens of feminist epistemology to recent feature importance research. We investigate which epistemic values are implicitly embedded in feature importance methods and whether, and how, they conflict with feminist epistemology. We offer some suggestions on how to conduct research on explanations that respects feminist epistemic values: attending to the importance of social context, recognizing the epistemic privilege of subjugated knowers, and adopting more interactional ways of knowing.