Previous post-processing bias mitigation algorithms for both group and individual fairness do not apply to regression models or to datasets with multi-class numerical labels. We propose a priority-based post-processing bias mitigation method for both group and individual fairness, built on the notion that similar individuals should receive similar outcomes irrespective of socio-economic factors, and that the greater the unfairness, the greater the injustice. We demonstrate this proposition through a case study on tariff allotment in a smart grid. Our novel framework uses a user segmentation algorithm to better capture consumption strategies. This process ensures priority-based fair pricing for the groups and individuals facing the greatest injustice, and it upholds fair tariff allotment across the entire population under consideration without modifying the built-in tariff calculation process. We also validate our method on a real-world criminal sentencing dataset and show superior performance to previous work.
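The approach above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's exact algorithm: users are segmented by consumption, the tariff gap between two groups is measured within each segment, and segments are corrected in priority order, the largest gap (most injustice) first. Only the model's outputs are adjusted, which is what makes it post-processing. The function names and the median-split segmentation are illustrative assumptions.

```python
# Hypothetical sketch of priority-based post-processing bias mitigation
# (an illustrative simplification, not the paper's exact algorithm).

def segment(consumption):
    """Assign each user to one of two consumption segments via a median split
    (a stand-in for the paper's user segmentation algorithm)."""
    cut = sorted(consumption)[len(consumption) // 2]
    return [0 if c < cut else 1 for c in consumption]

def priority_postprocess(tariffs, groups, segments):
    """Equalize mean tariff across two groups within each segment,
    processing the segment with the largest gap (most injustice) first.
    Only the predicted tariffs are adjusted; the model is untouched."""
    tariffs = list(tariffs)

    def gap(s):
        # Absolute difference of group means within segment s.
        g0 = [t for t, g, sg in zip(tariffs, groups, segments) if sg == s and g == 0]
        g1 = [t for t, g, sg in zip(tariffs, groups, segments) if sg == s and g == 1]
        if not g0 or not g1:
            return 0.0
        return abs(sum(g0) / len(g0) - sum(g1) / len(g1))

    # Priority ordering: largest unfairness handled first.
    for s in sorted(set(segments), key=gap, reverse=True):
        idx = [i for i, sg in enumerate(segments) if sg == s]
        mean_all = sum(tariffs[i] for i in idx) / len(idx)
        for g in (0, 1):
            gidx = [i for i in idx if groups[i] == g]
            if not gidx:
                continue
            mean_g = sum(tariffs[i] for i in gidx) / len(gidx)
            shift = mean_all - mean_g
            for i in gidx:
                tariffs[i] += shift  # shift each group's tariffs to the segment mean
    return tariffs

# Example: four users, two consumption segments, two protected groups.
segs = segment([1, 2, 8, 9])                                  # → [0, 0, 1, 1]
fair = priority_postprocess([10, 14, 20, 30], [0, 1, 0, 1], segs)
# Within each segment, group means are now equal: [12.0, 12.0, 25.0, 25.0]
```

Equalizing group means within segments is one simple group-fairness criterion; the paper's method additionally accounts for individual fairness, which this sketch omits for brevity.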