Risk assessment algorithms are being adopted by public sector agencies to make high-stakes decisions about human lives. These algorithms model "risk" based on individual client characteristics to identify the clients most in need. However, this understanding of risk is primarily based on easily quantifiable risk factors that present an incomplete and biased perspective of clients. We conducted a computational narrative analysis of child-welfare casenotes and draw attention to deeper systemic risk factors that are hard to quantify but that directly impact families and street-level decision-making. We found that beyond individual risk factors, the system itself poses significant risk: parents are over-surveilled by caseworkers and lack agency in decision-making processes. We also problematize the notion of risk as a static construct by highlighting the temporality and mediating effects of different risk, protective, systemic, and procedural factors. Finally, we caution against using casenotes in NLP-based systems by unpacking the limitations and biases embedded within them.