Merge pull request #684 from pupil-labs/aoi_metrics_csv_error
fix error in aoi_metrics.csv output
N-M-T authored May 31, 2024
2 parents 16a8d41 + 5ab2ab9 commit b1eea5c
Showing 2 changed files with 4 additions and 5 deletions.
6 changes: 3 additions & 3 deletions neon/data-collection/data-streams/index.md
@@ -18,11 +18,11 @@ The Neon Companion app can provide gaze data in real-time at up to 200 Hz. Gaze

![Gaze](./gaze.jpg)

- The achieved framerate can vary based on what Companion device is used and environmental conditions. On the OnePlus 10, the full 200 Hz can generally be achieved outside of especially hot environments. On the OnePlus 8, the framerate typically drops to ~120 Hz within a few minutes of starting a recording. Other apps running simultaneously on the phone may decrease the framerate.
+ The achieved framerate can vary based on what Companion device is used and environmental conditions. On the OnePlus 10 and Motorola Edge 40 Pro, the full 200 Hz can generally be achieved outside of especially hot environments. On the OnePlus 8, the framerate typically drops to ~120 Hz within a few minutes of starting a recording. Other apps running simultaneously on the phone may decrease the framerate.

- After a recording is uploaded to Pupil Cloud, gaze data is automatically re-computed at the full 200 Hz framerat and can be downloaded from there.
+ After a recording is uploaded to Pupil Cloud, gaze data is automatically re-computed at the full 200 Hz framerate and can be downloaded from there.

- The gaze estimation algorithm is based on end-2-end deep learning and provides gaze data robustly without requiring a calibration. We are currently working on a white paper that thoroughly evaluated the algorithm and will link it here once it is published.
+ The gaze estimation algorithm is based on end-2-end deep learning and provides gaze data robustly without requiring a calibration. You can find a high-level description as well as a thorough evaluation of the accuracy and robustness of the algorithm in our [white paper](https://zenodo.org/doi/10.5281/zenodo.10420388).

## Fixations & Saccades

3 changes: 1 addition & 2 deletions neon/pupil-cloud/visualizations/areas-of-interest/index.md
@@ -59,6 +59,5 @@ This file contains standard fixation and gaze metrics on AOIs.
| **average fixation duration [ms]** | Average fixation duration for the corresponding area of interest in milliseconds. |
| **total fixations** | Total number of fixations for the corresponding area of interest. |
| **time to first fixation [ms]** | Average time in milliseconds until the corresponding area of interest gets fixated on for the first time in a recording. |
| **time to first gaze [ms]** | Average time in milliseconds until the corresponding area of interest gets gazed at for the first time in a recording. |
| **total fixation duration [ms]** | Total fixation duration for the corresponding area of interest in milliseconds. |
| **total gaze duration [ms]** | Total gaze duration for the corresponding area of interest in milliseconds. |
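As a usage sketch (not part of this repository), the exported metrics can be loaded and ranked with pandas. The metric column names follow the table above; the `aoi name` identifier column and the sample values are assumptions for illustration only:

```python
import io

import pandas as pd

# Hypothetical sample rows mimicking an aoi_metrics.csv export; a real file
# would be downloaded from the Pupil Cloud AOI visualization. The "aoi name"
# column is an assumed identifier, not confirmed by the docs excerpt above.
sample_csv = io.StringIO(
    "aoi name,average fixation duration [ms],total fixations,"
    "time to first fixation [ms],total fixation duration [ms]\n"
    "logo,250.0,12,830.0,3000.0\n"
    "headline,310.0,20,450.0,6200.0\n"
)

df = pd.read_csv(sample_csv)

# Rank AOIs by total fixation duration to see which drew the most attention.
ranked = df.sort_values("total fixation duration [ms]", ascending=False)
print(ranked[["aoi name", "total fixation duration [ms]"]])
```

With the sample data above, `headline` ranks first because it accumulated the longest total fixation duration.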

