The issue of fairness in AI has received increasing attention in recent years. The problem can be approached by examining different protected attributes (e.g., ethnicity, gender) independently, but fairness with respect to individual protected attributes does not imply intersectional fairness. In this work, we frame the problem of intersectional fairness within a geometrical setting. We project our data onto a hypercube and split the analysis of fairness by levels, where each level encodes the number of protected attributes we intersect over. We prove mathematically that, while fairness does not propagate "down" the levels, it does propagate "up" the levels. This means that ensuring fairness for all subgroups at the lowest intersectional level (e.g., black women, white women, black men and white men) necessarily results in fairness at all levels above, including each of the protected attributes (e.g., ethnicity and gender) taken independently. We also derive a formula describing the variance of the set of estimated success rates at each level, under the assumption of perfect fairness. Using this theoretical finding as a benchmark, we define a family of metrics that capture overall intersectional bias. Finally, we propose that fairness can be metaphorically thought of as a "fractal" problem: in fractals, patterns at the smallest scale repeat at larger scales. We see from this analogy that tackling the problem at the lowest possible level, in a bottom-up manner, leads to the natural emergence of fair AI. We suggest that trustworthiness is necessarily an emergent, fractal and relational property of the AI system.
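The "up"-propagation claim can be illustrated with a minimal sketch (not the paper's code): if every subgroup at the lowest intersectional level shares a common success rate p, then any coarser group obtained by merging those subgroups, such as a single protected attribute taken independently, also has success rate p, since a weighted average of equal values is that value. The subgroup sizes below are purely hypothetical.

```python
# Minimal illustrative sketch, assuming a common success rate p for every
# lowest-level intersectional subgroup (hypothetical sizes, not real data).
p = 0.3  # common success rate under perfect lowest-level fairness

# Lowest-level subgroups over (ethnicity, gender).
sizes = {("black", "woman"): 120, ("white", "woman"): 340,
         ("black", "man"): 210, ("white", "man"): 530}

# Expected number of successes per subgroup under the common rate p.
successes = {g: p * n for g, n in sizes.items()}

def marginal_rate(value, position):
    """Success rate for one value of a single protected attribute,
    obtained by merging all lowest-level subgroups that share it."""
    groups = [g for g in sizes if g[position] == value]
    total = sum(sizes[g] for g in groups)
    return sum(successes[g] for g in groups) / total

# Fairness "propagates up": every marginal rate equals p.
for value, pos in [("black", 0), ("white", 0), ("woman", 1), ("man", 1)]:
    print(value, marginal_rate(value, pos))  # each prints 0.3
```

The converse fails, which is why fairness does not propagate "down": equal marginal rates for ethnicity and for gender can coexist with unequal rates among the intersectional subgroups.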