Abstract
The paper analyses the advantages and limitations of current technical solutions for dual eye-tracking (DUET) in relation to research questions from educational science about joint attention in multimodal teaching/learning collaboration. We argue that current systems are insufficient for the analysis of multimodal collaboration: the reviewed systems do not allow researchers to relate a participant’s eye movements to the video of the joint performance and the accompanying gestures without time-consuming manual coding. We describe a setup of two low-cost Pupil-Labs eyetrackers and propose an open-source utility, DUET for Pupil, that automatically produces synchronized gaze data in a shared coordinate system. The data are available both as a video of the shared surface overlaid with the gaze paths and a supplementary sound waveform, and as text data with synchronized coordinates of the two gazes. Our empirical evaluation of this technological solution reports a spatial accuracy of 1.27° of visual angle after post-hoc calibration. The advantages, limitations, and possible further enhancements of the system are discussed.
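The text output described in the abstract can be reproduced, in outline, from the surface-mapped gaze exports of two separate Pupil recordings. The sketch below is not the authors’ DUET for Pupil utility; it is a minimal illustration that assumes Pupil-Player-style CSV exports with normalized surface coordinates, a known clock offset between the two recorders, and illustrative file and column names.

```python
import pandas as pd

def load_surface_gaze(path):
    """Load a Pupil-Player-style surface gaze export.
    Assumed columns (illustrative): gaze_timestamp, x_norm, y_norm, on_surf."""
    df = pd.read_csv(path)
    df = df[df["on_surf"] == True]  # keep samples that landed on the shared surface
    return df[["gaze_timestamp", "x_norm", "y_norm"]].sort_values("gaze_timestamp")

def merge_dual_gaze(gaze_a, gaze_b, clock_offset_s=0.0, tolerance_s=0.05):
    """Align two gaze streams on a common timeline in shared surface coordinates.

    clock_offset_s: offset of recorder B's clock relative to recorder A
                    (e.g. estimated from a shared synchronization event).
    tolerance_s:    maximum timestamp distance for pairing samples.
    """
    b = gaze_b.copy()
    b["gaze_timestamp"] = b["gaze_timestamp"] + clock_offset_s
    merged = pd.merge_asof(
        gaze_a, b,
        on="gaze_timestamp",
        direction="nearest",
        tolerance=tolerance_s,
        suffixes=("_p1", "_p2"),
    )
    return merged.dropna()  # keep only moments where both gazes are available

if __name__ == "__main__":
    # Hypothetical file names for the two participants' surface gaze exports.
    p1 = load_surface_gaze("participant1_gaze_on_surface.csv")
    p2 = load_surface_gaze("participant2_gaze_on_surface.csv")
    dual = merge_dual_gaze(p1, p2, clock_offset_s=0.42)
    dual.to_csv("dual_gaze_shared_coordinates.csv", index=False)
```

Because both streams are expressed in the surface’s normalized coordinates, the merged rows can be plotted directly over a video of the shared surface to visualize joint attention, which is the form of output the paper describes.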
Original language | English |
---|---|
Pages (from-to) | 4-17 |
Journal | Rossijskij žurnal kognitivnoj nauki |
Volume | 5 |
Issue number | 3 |
Publication status | Published - 2018 |
Keywords
- collaboration
- dual eye-tracking (DUET)
- gesture
- joint attention
- multimodal communication
- head-mounted eyetracker
- remote eye-tracker
- Pupil-Labs