Data-driven tools are increasingly used to make consequential decisions. They have begun to advise employers on which job applicants to interview, judges on which defendants to grant bail, lenders on which homeowners to give loans, and more. In such settings, different data-driven rules result in different decisions. The problem is: to every data-driven rule, there are exceptions. While a data-driven rule may be appropriate for some, it may not be appropriate for all. As data-driven decisions become more common, there are cases in which it becomes necessary to protect the individuals who, through no fault of their own, are the data-driven exceptions. At the same time, it is impossible to scrutinize every one of the increasing number of data-driven decisions, raising the question: When and how should data-driven exceptions be protected? In this piece, we argue that individuals have the right to be an exception to a data-driven rule. That is, the presumption should not be that a data-driven rule--even one with high accuracy--is suitable for an arbitrary decision-subject of interest. Rather, a decision-maker should apply the rule only if they have exercised due care and due diligence (relative to the risk of harm) in excluding the possibility that the decision-subject is an exception to the data-driven rule. In some cases, the risk of harm may be so low that only cursory consideration is required. Although exercising due care and due diligence is meaningful in human-driven decision contexts, it is unclear what doing so means for a data-driven rule. We propose that determining whether a data-driven rule is suitable for a given decision-subject requires the consideration of three factors: individualization, uncertainty, and harm. We unpack this right in detail, providing a framework for assessing data-driven rules and describing what it would mean to invoke the right in practice.