Values? Camera? Action! An ethnography of an AI camera system used by the Netherlands Police

I. C. Donatz-Fest*

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Police departments around the world implement algorithmic systems to enhance various policing tasks. Ensuring such innovations take place responsibly, with public values upheld, is essential for public organisations. This paper analyses how public values are safeguarded in the case of MONOcam, an algorithmic camera system designed and used by the Netherlands police. The system employs artificial intelligence to detect whether car drivers are holding a mobile device. MONOcam can be considered a good example of value-sensitive design; many measures were taken to safeguard public values in this algorithmic system. In pursuit of responsible implementation of algorithms, most calls and literature focus on such value-sensitive design; less attention is paid to what happens beyond design. Building on more than 120 hours of ethnographic observations, as well as informal conversations and three semi-structured interviews, this research shows that public values deemed safeguarded in design are renegotiated as the system is implemented and used in practice. These findings led to direct impact, as MONOcam was improved in response. This paper thus highlights that algorithmic system design is often based on an ideal world, but it is in the complex, fuzzy sociomaterial realities of everyday professional routines that these systems are enacted and public values are renegotiated. While value-sensitive design is important, this paper shows that it offers no guarantees for safeguarding public values in practice.

Original language: English
Pages (from-to): 50-67
Number of pages: 18
Journal: Policing and Society
Volume: 35
Issue number: 1
Early online date: 2 Jul 2024
DOIs
Publication status: Published - 2025

Bibliographical note

Publisher Copyright:
© 2024 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.

Funding

This work was supported by Nederlandse Organisatie voor Wetenschappelijk Onderzoek [grant number NWO: 406.DI.19.011 (ALGOPOL)].

Funder: Nederlandse Organisatie voor Wetenschappelijk Onderzoek
Funder number: NWO: 406.DI.19.011

Keywords

• AI
• algorithm
• algorithmic policing
• camera
• police
• public values
