Abstract
Board games are fertile ground for the display of social signals and provide insights into psychological indicators in multi-person interactions. In this work, we introduce a new dataset collected from four-player board game sessions, recorded with multiple cameras and containing over 46 hours of visual material. The new MUMBAI dataset is extensively annotated with emotional moments across all game sessions, complemented by personality and game experience questionnaires. Our four-person setup allows the investigation of non-verbal interactions beyond dyadic settings. We present three benchmarks for expression detection and emotion classification, and we discuss potential research questions for the analysis of social interactions and group dynamics during board games.
| Original language | English |
| --- | --- |
| Pages (from-to) | 373–391 |
| Number of pages | 19 |
| Journal | Journal on Multimodal User Interfaces |
| Volume | 15 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - Dec 2021 |
Keywords
- Affective computing
- Board games
- Facial expression analysis
- Game experience
- Group dynamics
- Multimodal interaction
- Social interactions