gazeMapper: A tool for automated world-based analysis of gaze data from one or multiple wearable eye trackers

Diederick C. Niehorster*, Roy S. Hessels, Marcus Nyström, Jeroen S. Benjamins, Ignace T. C. Hooge

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

The problem: wearable eye trackers deliver eye-tracking data on a scene video that is acquired by a camera affixed to the participant’s head. Analyzing and interpreting such head-centered data is difficult and laborious manual work. Automated methods to map eye-tracking data to a world-centered reference frame (e.g., a screen or tabletop) are available. These methods usually make use of fiducial markers. However, such mapping methods may be difficult to implement, expensive, and eye tracker-specific. The solution: here we present gazeMapper, an open-source tool for automated mapping and processing of eye-tracking data. gazeMapper can (1) transform head-centered data to planes in the world, (2) synchronize recordings from multiple participants, and (3) determine data quality measures such as accuracy and precision. gazeMapper comes with a GUI application (Windows, macOS, and Linux) and supports 11 different wearable eye trackers from AdHawk, Meta, Pupil, SeeTrue, SMI, Tobii, and Viewpointsystem. It is also possible to sidestep the GUI and use gazeMapper as a Python library directly.
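gazeMapper's own Python API is not reproduced here; the following is a minimal sketch, using OpenCV and illustrative numbers, of the core idea the abstract describes: once fiducial markers with known positions on a plane are located in the scene video, a homography maps head-centered gaze coordinates into the world-fixed reference frame of that plane. All variable names and values are assumptions for illustration only.

```python
# Minimal sketch (not gazeMapper's actual API): mapping head-centered gaze to a
# world-fixed plane via a fiducial-marker homography. Numbers are illustrative.
import numpy as np
import cv2

# Positions of four marker corners as detected in one scene-video frame (pixels)
# and their known physical locations on the plane, e.g., a screen (millimetres).
marker_corners_image = np.array(
    [[212., 148.], [955., 131.], [978., 642.], [190., 660.]], dtype=np.float32)
marker_corners_plane = np.array(
    [[0., 0.], [520., 0.], [520., 330.], [0., 330.]], dtype=np.float32)

# Estimate the homography that maps scene-camera pixels to plane coordinates.
H, _ = cv2.findHomography(marker_corners_image, marker_corners_plane)

# Head-centered gaze samples reported on the scene video for this frame (pixels).
gaze_image = np.array([[601., 388.], [612., 395.], [598., 401.]], dtype=np.float32)

# Transform the gaze samples into the world-fixed plane reference frame (mm).
gaze_plane = cv2.perspectiveTransform(gaze_image.reshape(-1, 1, 2), H).reshape(-1, 2)
print(gaze_plane)
```

A full pipeline such as gazeMapper additionally handles per-frame marker detection, camera calibration, multiple planes, synchronization across recordings from multiple participants, and data quality computation, all of which this sketch omits.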

Original language: English
Article number: 188
Journal: Behavior Research Methods
Volume: 57
Issue number: 7
DOIs
Publication status: Published - Jul 2025

Bibliographical note

Publisher Copyright:
© The Author(s) 2025.

Keywords

  • Data quality
  • Eye movements
  • Eye tracking
  • Gaze
  • Head-fixed reference frame
  • Mobile eye tracking
  • Plane
  • Surface
  • Tool
  • Wearable eye tracking
  • World-fixed reference frame
