In 1996, philosopher Helen Nissenbaum issued a clarion call concerning the erosion of accountability in society due to the ubiquitous delegation of consequential functions to computerized systems. Using the conceptual framing of moral blame, Nissenbaum described four types of barriers to accountability that computerization presented: 1) "many hands," the problem of attributing moral responsibility for outcomes caused by many moral actors; 2) "bugs," a way software developers might shrug off responsibility by suggesting software errors are unavoidable; 3) "computer as scapegoat," shifting blame to computer systems as if they were moral actors; and 4) "ownership without liability," a free pass to the tech industry to deny responsibility for the software they produce. We revisit these four barriers in relation to the recent ascendance of data-driven algorithmic systems--technology often folded under the heading of machine learning (ML) or artificial intelligence (AI)--to uncover the new challenges for accountability that these systems present. We then look ahead to how one might construct and justify a moral, relational framework for holding responsible parties accountable, and argue that the FAccT community is uniquely well-positioned to develop such a framework to weaken the four barriers.