Abstract
Mobile head-worn eye trackers allow researchers to record eye-movement data as participants freely move around and interact with their surroundings. However, participant behavior may cause the eye tracker to slip on the participant’s head, potentially strongly affecting data quality. To investigate how this eye-tracker slippage affects data quality, we designed experiments in which participants mimic behaviors that can cause a mobile eye tracker to move. Specifically, we investigated data quality when participants speak, make facial expressions, and move the eye tracker. Four head-worn eye-tracking setups were used: (i) Tobii Pro Glasses 2 in 50 Hz mode, (ii) SMI Eye Tracking Glasses 2.0 60 Hz, (iii) Pupil-Labs’ Pupil in 3D mode, and (iv) Pupil-Labs’ Pupil with the Grip gaze estimation algorithm as implemented in the EyeRecToo software. Our results show that whereas gaze estimates of the Tobii and Grip remained stable when the eye tracker moved, the other systems exhibited significant errors (0.8–3.1° increase in gaze deviation over baseline) even for the small amounts of glasses movement that occurred during the speech and facial-expressions tasks. We conclude that some of the tested eye-tracking setups may not be suitable for investigating gaze behavior when high accuracy is required, such as during face-to-face interaction scenarios. We recommend that users of mobile head-worn eye trackers perform similar tests with their setups to become aware of their characteristics. This will enable researchers to design experiments that are robust to the limitations of their particular eye-tracking setup.
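The accuracy figures above are reported as angular gaze deviation in degrees. As an illustration only (this is not the authors' analysis code), the deviation between a measured 3D gaze direction and a known target direction can be computed as the angle between the two vectors:

```python
import math

def angular_deviation(gaze, target):
    """Angle in degrees between a measured gaze direction vector
    and the direction toward a known fixation target.

    Both inputs are 3-element direction vectors; they need not be
    unit length, as normalization is handled here.
    """
    dot = sum(g * t for g, t in zip(gaze, target))
    norm_g = math.sqrt(sum(g * g for g in gaze))
    norm_t = math.sqrt(sum(t * t for t in target))
    # Clamp to [-1, 1] to guard against floating-point rounding
    cos_theta = max(-1.0, min(1.0, dot / (norm_g * norm_t)))
    return math.degrees(math.acos(cos_theta))
```

A slippage-induced error of the magnitude reported (e.g., 2° over baseline) would show up as a corresponding increase in this per-sample deviation relative to the pre-manipulation recording.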
| Original language | English |
|---|---|
| Pages (from-to) | 1140-1160 |
| Number of pages | 21 |
| Journal | Behavior Research Methods |
| Volume | 52 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 1 Jun 2020 |
Bibliographical note
Funding Information: We thank Mats Dahl, Simon Granér and Roger Johansson for lending us the Pupil-Labs equipment, and Henrik Garde and Peter Roslund for expert help with photography. We gratefully acknowledge the Lund University Humanities Lab. We thank two anonymous reviewers for helpful comments on a previous version of this manuscript. The data and materials for the experiment are available at https://github.com/dcnieho/GlassesTestCodeData, and the experiment was not preregistered. Open access funding provided by Lund University.
Publisher Copyright:
© 2019, The Author(s).
Keywords
- Data quality
- Eye movements
- Head-mounted eye tracking
- Mobile eye tracking
- Natural behavior
- Wearable eye tracking