Automatic detection of gaze convergence in multimodal collaboration: a dual eye-tracking technology

A.Y. Shvarts, Andrey Stepanov, Dmitry Chumachenko

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

The paper analyses the advantages and limitations of current technical solutions for dual eye-tracking (DUET) in relation to research questions from educational science about joint attention in multimodal teaching/learning collaboration. The insufficiency of the current systems for the analysis of multimodal collaboration is stated, as the reviewed systems do not allow researchers to relate a participant’s eye movements to the video of the joint performance and the accompanying gestures without time-consuming manual coding. We describe a system of two low-cost Pupil-Labs eye trackers and propose an open-source utility, DUET for Pupil, that automatically produces synchronized gaze data in a shared coordinate system. The data are available as a video of the surface overlaid with the gaze paths and supplementary sound waveforms, and as textual data with synchronized coordinates of the two gazes. Our empirical evaluation of this technological solution reports a spatial accuracy of 1.27° of visual angle after post-hoc calibration. The advantages, limitations, and possible further enhancements of the system are discussed.
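To illustrate the kind of analysis that such synchronized data makes possible, the sketch below shows how gaze convergence could be detected once both gaze streams have been mapped into the shared surface coordinate system. This is a minimal illustration only, not the authors' DUET for Pupil implementation: the column names, the nearest-timestamp alignment, the unit-to-degree conversion factor, and the 2° convergence threshold are all assumptions made for the example.

    # Minimal sketch (not the authors' DUET for Pupil code): given two gaze
    # streams already mapped into a shared surface coordinate system, flag
    # moments of gaze convergence by thresholding the angular distance
    # between the two gaze points.
    import numpy as np
    import pandas as pd

    def detect_convergence(gaze_a: pd.DataFrame,
                           gaze_b: pd.DataFrame,
                           deg_per_unit: float,
                           threshold_deg: float = 2.0) -> pd.DataFrame:
        """Merge two synchronized gaze streams and mark convergent samples.

        gaze_a, gaze_b : DataFrames with columns ['timestamp', 'x', 'y'],
                         where x, y are coordinates on the shared surface.
        deg_per_unit   : conversion from surface units to degrees of visual
                         angle (depends on surface size and viewing distance).
        threshold_deg  : angular distance below which gazes count as converged.
        """
        # Align sample-by-sample on the nearest timestamp; the streams are
        # assumed to be already synchronized to a common clock.
        merged = pd.merge_asof(gaze_a.sort_values('timestamp'),
                               gaze_b.sort_values('timestamp'),
                               on='timestamp', suffixes=('_a', '_b'),
                               direction='nearest')

        # Euclidean distance between the two gaze points on the surface,
        # converted to degrees of visual angle.
        dist = np.hypot(merged['x_a'] - merged['x_b'],
                        merged['y_a'] - merged['y_b'])
        merged['angular_distance_deg'] = dist * deg_per_unit
        merged['converged'] = merged['angular_distance_deg'] < threshold_deg
        return merged

In practice, the conversion factor would depend on the size of the shared surface and the viewing distance, and the convergence threshold would be chosen with the reported 1.27° spatial accuracy in mind.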
Original language: English
Pages (from-to): 4-17
Journal: Rossijskij žurnal kognitivnoj nauki
Volume: 5
Issue number: 3
Publication status: Published - 2018

Keywords

  • collaboration
  • dual eye-tracking (DUET)
  • gesture
  • joint attention
  • multimodal communication
  • head-mounted eyetracker
  • remote eye-tracker
  • Pupil-Labs
