Gazealytics: A Unified and Flexible Visual Toolkit for Exploratory and Comparative Gaze Analysis

Introduction

Gazealytics is a web-based visual eye tracking analytics toolkit. It unifies a set of gaze analytics features that support flexible exploratory analysis, together with annotation of areas of interest (AOIs) and time windows of interest (TWIs) and filter options based on multiple criteria, to visually analyse eye tracking data across time and space.

Gazealytics features coordinated views that unify spatiotemporal exploration of fixations, saccades, and scanpaths for various analytical tasks, helping eye tracking analysts interactively explore their data. Analysts can visualize data at multiple levels of granularity: overview, group, and individual. Data can be grouped across samples, user-defined AOIs, or TWIs to support aggregate or filtered analysis of gaze activity.

It supports a flexible interface for integration into analysts' existing workflows. User-defined samples, AOIs, and TWIs can be imported, and visual metrics results and their coordinated visualizations can be explored on the fly. The interface allows exporting and restoring snapshots of an analysis, including multicriteria parameters, AOIs, metrics, visualizations, and text annotations, for post-analysis as well as reporting purposes.

A live instance can be found at https://www2.visus.uni-stuttgart.de/gazealytics/.

Citation

Chen, K. T., Prouzeau, A., Langmead, J., Whitelock-Jones, R. T., Lawrence, L., Dwyer, T., ... & Goodwin, S. (2023, May). Gazealytics: A Unified and Flexible Visual Toolkit for Exploratory and Comparative Gaze Analysis. In Proceedings of the 2023 Symposium on Eye Tracking Research and Applications (pp. 1-7). Preprint available at arXiv:2303.17202.

Please cite using the BibTeX entry below:

@inproceedings{chen2023gazealytics,
  title={Gazealytics: A Unified and Flexible Visual Toolkit for Exploratory and Comparative Gaze Analysis},
  author={Chen, Kun-Ting and Prouzeau, Arnaud and Langmead, Joshua and Whitelock-Jones, Ryan T and Lawrence, Lee and Dwyer, Tim and Hurter, Christophe and Weiskopf, Daniel and Goodwin, Sarah},
  booktitle={Proceedings of the 2023 Symposium on Eye Tracking Research and Applications},
  pages={1--7},
  year={2023}
}

Used by

Vriend, S. A., Vidyapu, S., Rama, A., Chen, K. T., & Weiskopf, D. (2024, June). Which Experimental Design is Better Suited for VQA Tasks?: Eye Tracking Study on Cognitive Load, Performance, and Gaze Allocations. In Proceedings of the 2024 Symposium on Eye Tracking Research and Applications (pp. 1-7).

Wang, Y., Jiang, Y., Hu, Z., Ruhdorfer, C., Bâce, M., Bulling, A. (2024, June). VisRecall++: Analysing and Predicting Visualisation Recallability from Gaze Behaviour. In Proceedings of the ACM on Human-Computer Interaction (PACM HCI), vol. 8, no. ETRA, Art. 239.

Chen, K. T., Ngo, Q. Q., Kurzhals, K., Marriott, K., Dwyer, T., Sedlmair, M., & Weiskopf, D. (2023, May). Reading Strategies for Graph Visualizations that Wrap Around in Torus Topology. In Proceedings of the 2023 Symposium on Eye Tracking Research and Applications (pp. 1-7).

Pozdniakov, S., Martinez-Maldonado, R., Tsai, Y. S., Echeverria, V., Srivastava, N., & Gasevic, D. (2023, March). How Do Teachers Use Dashboards Enhanced with Data Storytelling Elements According to their Data Visualization Literacy Skills?. In LAK23: 13th International Learning Analytics and Knowledge Conference (pp. 89-99).

Cai, M., Zheng, B., & Demmans Epp, C. (2022, July). Towards Supporting Adaptive Training of Injection Procedures: Detecting Differences in the Visual Attention of Nursing Students and Experts. In Proceedings of the 30th ACM Conference on User Modeling, Adaptation and Personalization (pp. 286-294).

Media

Requirements

  • This repository
  • Python 3.5 or above (web server scripting)

Tutorial

There are several videos for getting started with Gazealytics (a previous release was named webVETA).

More details can be found in the Gazealytics paper: https://arxiv.org/pdf/2303.17202.pdf

Features

Gazealytics implements the following features and algorithms:

Visual analysis examples

The examples below showcase Gazealytics's capabilities as a unified and flexible visual eye tracking analytics toolkit that is ready to integrate into users' existing data analysis workflows.

An exploration can begin at any stage of multi-way visual exploration (a-f) and move between them as shown by arrows.

Multiple coordinated views: (a) data management panel; (b) spatial panel; (c) parameter control panel; (d) metric panel; (e) timeline panel. The flexible user interface helps users perform visual analysis across multiple eye tracking analytical tasks.

Analytical results can be easily exported and integrated into users' own statistical testing pipeline.

Visual support for interactive exploration of AOIs. The coordinated views of the spatial panel, quantitative visual metrics, and the AOI sequence chart help a user find a more suitable AOI definition.

Visualizations

The examples below demonstrate Gazealytics's visual analysis capabilities across multiple eye tracking analytical tasks.

Raw gaze sequence visualization/scanpath visualization with fixation filtering

Data inspection before applying any fixation filtering. Left: raw gaze sequence; right: scanpath with fixation filtering. By default, Gazealytics implements Poole and Ball's dispersion-based algorithm (IDT).
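To illustrate how dispersion-based fixation filtering works, here is a minimal I-DT sketch in plain Python. The thresholds (`dispersion_px`, `min_duration_ms`), the function name, and the `(t, x, y)` sample format are illustrative assumptions, not Gazealytics's actual defaults or data format:

```python
def idt_fixations(gaze, dispersion_px=35.0, min_duration_ms=100.0):
    """Dispersion-threshold (I-DT) fixation detection sketch.

    gaze: list of (t_ms, x, y) samples, sorted by time.
    Returns a list of (start_ms, end_ms, centroid_x, centroid_y) fixations.
    Thresholds are illustrative defaults, not the toolkit's.
    """
    fixations = []
    i, n = 0, len(gaze)
    while i < n:
        # Grow an initial window spanning at least min_duration_ms.
        j = i
        while j < n and gaze[j][0] - gaze[i][0] < min_duration_ms:
            j += 1
        if j >= n:
            break
        xs = [p[1] for p in gaze[i:j + 1]]
        ys = [p[2] for p in gaze[i:j + 1]]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= dispersion_px:
            # Extend the window while dispersion stays under threshold.
            while j + 1 < n:
                xs.append(gaze[j + 1][1])
                ys.append(gaze[j + 1][2])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_px:
                    xs.pop()
                    ys.pop()
                    break
                j += 1
            fixations.append((gaze[i][0], gaze[j][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1
        else:
            i += 1  # drop the first sample and retry
    return fixations
```

For example, a recording that dwells at one point and then jumps to another yields two fixations separated by a saccade.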

Coordinated views: Scanpath and timelines

Within-subject scanpath comparison of trials, ordered by task speed. The annotations (a)-(n) and ticks/crosses were added externally using Paint.

Scanpath by saccade types

Short, long, and glance saccades. The short-saccade cutoff can be set to any value between 0 and the maximum number of pixels; it is set to 100 pixels in this illustration.

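The amplitude-based short/long classification described above can be sketched as follows. The function name and cutoff handling are hypothetical, and the "glance" type is omitted here because its definition depends on additional context not described above:

```python
import math

def classify_saccade(x1, y1, x2, y2, short_cutoff_px=100.0):
    """Label a saccade as 'short' or 'long' by its pixel amplitude.

    The 100 px cutoff mirrors the example in the text; in the toolkit
    the cutoff is user-adjustable between 0 and the maximum distance.
    """
    amplitude = math.hypot(x2 - x1, y2 - y1)
    return "short" if amplitude <= short_cutoff_px else "long"
```

A saccade spanning 100 px or less would then be rendered in the "short" style, anything longer in the "long" style.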

Scanpath visualization with overlaid density maps

Scanpaths can be drawn with an overlaid density map (implemented as contours of bell-curve kernels) to show both the gaze sequence and the fixation distribution.
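A density field of this kind can be sketched by summing bell-curve (Gaussian) kernels over fixation points; contouring the resulting grid produces the overlay. Grid resolution, sigma, and duration weighting below are illustrative choices, not the toolkit's implementation:

```python
import numpy as np

def density_map(fixations, width, height, sigma=30.0, cell=10):
    """Sum a 2D Gaussian kernel per fixation on a coarse grid.

    fixations: iterable of (x, y, duration) tuples.
    Returns a (height/cell, width/cell) array; contour it to draw
    the density overlay. Parameters are illustrative.
    """
    gx, gy = np.meshgrid(np.arange(0, width, cell),
                         np.arange(0, height, cell))
    field = np.zeros_like(gx, dtype=float)
    for x, y, dur in fixations:
        # Duration-weighted bell curve centred on the fixation.
        field += dur * np.exp(-((gx - x) ** 2 + (gy - y) ** 2)
                              / (2.0 * sigma ** 2))
    return field
```

The field peaks at the fixation locations, so longer dwells produce denser contour rings.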

Saccades are classified into short, long, and glance types. Saccade filtering is applied to explore the gaze transitions between AOIs.


Spatiotemporal visualization linked with TWI and video

Example of Gazealytics GUI layout showing data panel (upper left), spatial panel, video panel, timeline panel, and control panel (right).

Spatiotemporal visualizations can be linked with time windows of interest (TWIs) and videos. When a specific TWI is chosen, the video is automatically set to the start time of the selected TWI.


Known issue: synchronizing the timeline while playing the gaze video is yet to be fixed.

Aggregated saccade visualization/saccade bundling

Interactive annotation of areas of interest

Interactive visual aggregation and group level visualizations

An exploration can begin at any stage of multi-way visual exploration (a-d), with interactive visual grouping over samples, AOIs, and TWIs, and move between stages. Multiple coordinated views: density maps and visual metrics.

Data management panel: sample group/AOI group/TWI group

Linking and brushing

between the metric panel and spatial/temporal views

between matrix relationships (AOI-AOI transitions) and spatial/temporal views

Small multiples (minimaps)

fixation overlap matrix/visual metrics in matrix

fixation overlap matrix/statistics

density map matrix
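One plausible way to populate a fixation overlap matrix is sketched below. The radius-based overlap score and all names here are assumptions for illustration, not necessarily the metric Gazealytics computes:

```python
import math

def fixation_overlap(fix_a, fix_b, radius_px=50.0):
    """Fraction of A's fixations lying within radius_px of some
    fixation of B. An illustrative stand-in overlap score; the
    toolkit's actual metric may differ."""
    if not fix_a:
        return 0.0
    hits = sum(
        1 for (xa, ya) in fix_a
        if any(math.hypot(xa - xb, ya - yb) <= radius_px
               for (xb, yb) in fix_b)
    )
    return hits / len(fix_a)

def overlap_matrix(participants):
    """Pairwise overlap scores: one row/column per participant's
    fixation list, suitable for a small-multiples matrix view."""
    return [[fixation_overlap(a, b) for b in participants]
            for a in participants]
```

Each cell of the matrix then summarizes how closely one participant's fixations cover another's, with 1.0 on the diagonal.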

Resizable GUIs

The GUI is flexible: the size of each canvas can be adjusted to suit analysts' specific analytical needs.

Export metrics for post-analysis

samples-AOIs

sample group-AOIs

time windows of interest (TWIs)-AOIs
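A sketch of how such an exported metrics table might be consumed in a post-analysis script. The column names (`participant`, `aoi`, `dwell_time_ms`) are hypothetical; match them to the headers of your actual Gazealytics export:

```python
import csv
import io
from statistics import mean

def mean_dwell_by_aoi(csv_text):
    """Aggregate a samples-by-AOIs metrics export: mean dwell time
    per AOI across participants. Column names are assumptions."""
    rows = csv.DictReader(io.StringIO(csv_text))
    by_aoi = {}
    for row in rows:
        by_aoi.setdefault(row["aoi"], []).append(float(row["dwell_time_ms"]))
    return {aoi: mean(vals) for aoi, vals in by_aoi.items()}
```

The resulting per-AOI means can then feed directly into a statistical test of choice.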

Matrix reordering

Demo datasets

Development

To run Gazealytics from its source code:

Download and install Miniconda, then install the dependencies and start the server:

conda install -c conda-forge ujson 
conda install -c conda-forge py 
conda install -c conda-forge numpy
conda install -c conda-forge pyopencl 
conda install -c conda-forge pocl
<the path to python3.7 binary> <the path to server.py>

This starts a server in development mode at http://localhost:8080/. Note that pyopencl is not required unless you are working with saccade bundling visualizations.

Team

The toolkit is developed and maintained by:

  • Kun-Ting Chen (University of Stuttgart)
  • Yao Wang (University of Stuttgart)
  • Sita Vriend (University of Stuttgart)
  • Sarah Goodwin (Monash University)

Past developers:

  • Joshua Langmead (Monash University)
  • Ishwari Bhade (Monash University)
  • Ryan T Whitelock-Jones (Monash University)

Main contributors:

  • Kun-Ting Chen (Centre for Research on Engineering Software Technologies, University of Adelaide)
  • Arnaud Prouzeau (Inria & LaBRI (University of Bordeaux, CNRS, Bordeaux-INP))
  • Joshua Langmead (Monash University)
  • Ryan T Whitelock-Jones (Monash University)
  • Lee Lawrence (Monash University)
  • Tim Dwyer (Monash University)
  • Christophe Hurter (ENAC, UniversitĂ© de Toulouse)
  • Daniel Weiskopf (University of Stuttgart)
  • Sarah Goodwin (Monash University)

License

Gazealytics is provided under the MIT License.