TY - CONF
T1 - Perceived Algorithmic Fairness using Organizational Justice Theory: An Empirical Case Study on Algorithmic Hiring
AU - Juijn, Guusje
AU - Stoimenova, Niya
AU - Reis, João
AU - Nguyen, Dong
N1 - Funding Information:
We want to thank the anonymous reviewers for their useful feedback. Moreover, we thank Maartje de Graaf, Rosanna Nagtegaal, Sieuwert van Otterloo, Goya van Boven, Michael Pieke, Daan van der Weijden, and Tim Koornstra for their advice on this project.
Publisher Copyright:
© 2023 ACM.
PY - 2023/8/8
Y1 - 2023/8/8
N2 - Growing concerns about the fairness of algorithmic decision-making systems have prompted a proliferation of mathematical formulations aimed at remedying algorithmic bias. Yet, integrating mathematical fairness alone into algorithms is insufficient to ensure their acceptance, trust, and support by humans. It is also essential to understand what humans perceive as fair. In this study, we therefore conduct an empirical user study into crowdworkers' algorithmic fairness perceptions, focusing on algorithmic hiring. We build on perspectives from organizational justice theory, which categorizes fairness into distributive, procedural, and interactional components. By doing so, we find that algorithmic fairness perceptions are higher when crowdworkers are provided not only with information about the algorithmic outcome but also about the decision-making process. Remarkably, we observe this effect even when the decision-making process can be considered unfair, i.e., when gender, a sensitive attribute, is used as a main feature. By showing realistic trade-offs between fairness criteria, we moreover find a preference for equalizing false negatives over equalizing selection rates amongst groups. Our findings highlight the importance of considering all components of algorithmic fairness, rather than solely treating it as an outcome distribution problem. Importantly, our study contributes to the literature on the connection between mathematical and perceived algorithmic fairness, and highlights the potential benefits of leveraging organizational justice theory to enhance the evaluation of perceived algorithmic fairness.
KW - algorithmic decision-making
KW - algorithmic hiring
KW - organizational justice
KW - perceived fairness
UR - http://www.scopus.com/inward/record.url?scp=85173606739&partnerID=8YFLogxK
U2 - 10.1145/3600211.3604677
DO - 10.1145/3600211.3604677
M3 - Conference contribution
SN - 979-8-4007-0231-0
SP - 775
EP - 785
BT - AIES 2023 - Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society
PB - Association for Computing Machinery
ER -