Algorithmic discrimination, the role of GPS, and the limited scope of EU non-discrimination law

Elena Gramano, Miriam Kullmann

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic › peer-review

Abstract

This chapter investigates the potentially discriminatory outcomes of algorithmic decision-making and the effectiveness and suitability of the current legal framework in preventing and sanctioning discrimination perpetrated through algorithms. We build a research agenda by drawing on the concrete implications and issues stemming from the abovementioned cases, which happen to be the first court decisions on this matter. Adopting a practical approach by analysing how the two courts use the existing legal sources could, we believe, reframe the debate and draw attention to the most problematic aspects of this matter. Our core argument is that algorithms can bring about direct or, more often, indirect discrimination. We therefore assess the decision-making process based on neutral factors, which, at least in theory, attributes no significance to workers' personal characteristics or conditions. Moreover, GPS (Global Positioning System), as an instrument used by businesses, including platforms, to gather data unrelated or not directly related to work performance, can be used to discriminate or can produce discriminatory effects. We focus on how GPS is being repurposed by platform companies offering on-location services and on the extent to which GPS is covered by EU non-discrimination law. It will become clear that, even if GPS data were brought within the scope of one or more of the protected grounds, since the data collected by GPS can amount to 'proxy discrimination' for race or ethnic origin, age, or even gender, we face a second challenge: the substantive scope of most EU non-discrimination law. This shows how perilous the gig economy can be: workers, as defined by EU case law, are protected by EU law against discrimination; the self-employed, or contractors, as platform companies often classify their workforce, are not, or not entirely.
The chapter ends with a discussion rooted in the premise that algorithms do not differentiate between different contractual underpinnings and thus are 'blind' to some extent: we discuss whether EU non-discrimination law needs broadening to protect a larger group of platform workers, especially when distinguishing between self-employed and employee platform workers is not possible on the face of it.
Original language: English
Title of host publication: A Research Agenda for the Gig Economy and Society
Publisher: Edward Elgar Publishing
ISBN (Electronic): 9781800883512
ISBN (Print): 9781800883505
DOIs
Publication status: Published - 15 Nov 2022
Externally published: Yes
