Abstract
Games are designed to elicit strong emotions during game play, especially when players are competing against each other. Artificial Intelligence applied to predict a player's emotions has mainly been tested on single-player experiences, in low-stakes settings, and over short-term interactions. How do players experience and manifest affect in high-stakes competitions, and which modalities can capture this? This paper reports a first experiment in this line of research, using a competition of the video game Hearthstone where both competing players' game play and facial expressions were recorded over the course of the entire match, which could last up to 41 minutes. Based on two experts' annotations of tension made with a continuous video affect annotation tool, we attempt to predict tension from the webcam footage of the players alone. Treating both the input and the tension output in a relative fashion, our best models reach 66.3% average accuracy (up to 79.2% on the best fold) in the challenging leave-one-participant-out cross-validation task. This initial experiment shows a way forward for affect annotation in games "in the wild", i.e. in high-stakes, real-world competitive settings.
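To make the evaluation protocol described in the abstract concrete, the sketch below illustrates a relative (rises vs. falls) treatment of continuous tension annotations and a leave-one-participant-out split. This is a minimal sketch, not the authors' implementation: the per-window facial-expression features, the random-forest classifier, and the windowing are illustrative assumptions.

```python
# Minimal sketch: leave-one-participant-out cross-validation over per-window
# webcam features, with tension treated in a relative (ordinal) fashion by
# labelling whether the annotated tension rises or falls between consecutive
# windows. Feature extraction and classifier choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut


def to_relative_labels(features, tension, participants):
    """Turn continuous tension traces into binary rise/fall labels
    between consecutive windows of the same participant."""
    X, y, groups = [], [], []
    for p in np.unique(participants):
        idx = np.where(participants == p)[0]
        for a, b in zip(idx[:-1], idx[1:]):
            delta = tension[b] - tension[a]
            if delta == 0:  # skip ties; only ordinal changes are kept
                continue
            X.append(features[b] - features[a])  # relative input as well
            y.append(int(delta > 0))
            groups.append(p)
    return np.array(X), np.array(y), np.array(groups)


def lopo_accuracy(features, tension, participants):
    """features: (n_windows, n_dims) facial-expression descriptors per window;
    tension: (n_windows,) continuous annotation; participants: (n_windows,) ids."""
    X, y, groups = to_relative_labels(features, tension, participants)
    accs = []
    for train, test in LeaveOneGroupOut().split(X, y, groups):
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X[train], y[train])
        accs.append(clf.score(X[test], y[test]))
    return float(np.mean(accs)), float(np.max(accs))
```

Reporting both the mean and the maximum per-fold accuracy mirrors the "average accuracy (up to ... at the best fold)" phrasing of the abstract; the specific model and features above are placeholders.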
Original language | English |
---|---|
Title of host publication | Proceedings of the 18th International Conference on the Foundations of Digital Games, FDG 2023 |
Editors | Phil Lopes, Filipe Luz, Antonios Liapis, Henrik Engstrom |
Publisher | Association for Computing Machinery |
ISBN (Electronic) | 9781450398565 |
ISBN (Print) | 9781450398558 |
DOIs | |
Publication status | Published - 12 Apr 2023 |
Event | 18th International Conference on the Foundations of Digital Games, FDG 2023 - Lisbon, Portugal |
Duration | 11 Apr 2023 → 14 Apr 2023 |
Publication series
Name | ACM International Conference Proceeding Series |
---|---|
Conference
Conference | 18th International Conference on the Foundations of Digital Games, FDG 2023 |
---|---|
Country/Territory | Portugal |
City | Lisbon |
Period | 11/04/23 → 14/04/23 |
Bibliographical note
Funding Information: The authors would like to thank the Tilburg University Esports Association “Link” for their collaboration. Antonios Liapis and Georgios N. Yannakakis were supported by the European Union’s H2020 research and innovation programme (Grant Agreement No. 951911).
Publisher Copyright:
© 2023 ACM.
Keywords
- competitive games
- facial expression analysis
- player affect
- player modeling