diff --git a/alpha-lab/gaze-metrics-in-aois/data/enrichment_info.txt b/alpha-lab/gaze-metrics-in-aois/data/enrichment_info.txt
index 3ab67f218..44eb745f0 100644
--- a/alpha-lab/gaze-metrics-in-aois/data/enrichment_info.txt
+++ b/alpha-lab/gaze-metrics-in-aois/data/enrichment_info.txt
@@ -1,3 +1,3 @@
reference-image-mapper export
-Find detailed documentation here: https://docs.pupil-labs.com/cloud/enrichments/#reference-image-mapper
+Find detailed documentation here: https://docs.pupil-labs.com/cloud/enrichments/#reference-image-mapper
Version 1
\ No newline at end of file
diff --git a/components/header/NavBarHamburger.vue b/components/header/NavBarHamburger.vue
index c244ec681..0cfb2336f 100644
--- a/components/header/NavBarHamburger.vue
+++ b/components/header/NavBarHamburger.vue
@@ -1,31 +1,31 @@
@@ -47,86 +47,86 @@
diff --git a/core/software/pupil-capture/index.md b/core/software/pupil-capture/index.md
index 42da8d22d..f04cdf0f5 100644
--- a/core/software/pupil-capture/index.md
+++ b/core/software/pupil-capture/index.md
@@ -15,14 +15,14 @@ The World window is the main control center for Pupil Capture. It displays a liv
### Keyboard Shortcuts
-| Keyboard Shortcut | Description |
-|:--------------------|:-------------------------------------------------------------|
-| `r` | Start and stop recording |
-| `c` | Start and stop calibration |
-| `t` | Start and stop validation |
-| `a` | Surface tracker: Add new surface |
-| `x` | Add annotation (default keyboard shortcut) |
-| `i` | Camera intrinsic estimation: Take snapshot of circle pattern |
+| Keyboard Shortcut | Description |
+| :---------------- | :----------------------------------------------------------- |
+| `r` | Start and stop recording |
+| `c` | Start and stop calibration |
+| `t` | Start and stop validation |
+| `a` | Surface tracker: Add new surface |
+| `x` | Add annotation (default keyboard shortcut) |
+| `i` | Camera intrinsic estimation: Take snapshot of circle pattern |
## Video Source Selection
By default Pupil Capture will use Local USB as the capture source.
@@ -64,11 +64,11 @@ sudo usermod -a -G plugdev $USER
Due to new [technical limitations](https://github.com/libusb/libusb/issues/1014), Pupil Capture and Pupil Service need to be started with administrator privileges to get access to the video camera feeds. To do that, copy the applications into your /Applications folder and run the corresponding command from the terminal:
-Pupil Capture:
+Pupil Capture:
``` zsh
sudo /Applications/Pupil\ Capture.app/Contents/MacOS/pupil_capture
```
-Pupil Service:
+Pupil Service:
``` zsh
sudo /Applications/Pupil\ Service.app/Contents/MacOS/pupil_service
```
@@ -135,7 +135,7 @@ Before starting a calibration, ensure that the participant's pupil is robustly d
Your Pupil is properly detected by the camera.
-
+

@@ -189,7 +189,7 @@ Make sure to always use the **v0.4 marker design** for best detection performanc
-[Download Pupil Labs Calibration Marker v0.4](http://docs.pupil-labs.com/core/software/pupil-capture/v0.4_marker.pdf') to print or display on smartphone/tablet screen.
+[Download Pupil Labs Calibration Marker v0.4](./v0.4_marker.pdf) to print or display on smartphone/tablet screen.
#### Single Marker Calibration Choreography
@@ -385,7 +385,7 @@ Apriltags ready to use:
-
+

@@ -462,8 +462,8 @@ For all new projects we strongly recommend using Apriltags!
### Blink Detector
-The online blink detector classifies [blinks](/terminology/#blinks) according to onset and offset thresholds
-associated with [2D pupil confidence](/terminology/#confidence). See the
+The online blink detector classifies [blinks](/terminology/#blinks) according to onset and offset thresholds
+associated with [2D pupil confidence](/terminology/#confidence). See the
[Blink Detector documentation](/software/pupil-player/#blink-detector) for more information.
Read more about accessing blink detection results in real-time in the [developer documentation](/developer/network-api/#blink-messages).
@@ -507,13 +507,13 @@ Specifically, the intrinsics are saved to a file with the name pattern
| Pupil Cam1 ID2 | `640x480` `1280x720` `1920x1080` | Pupil Core high-speed scene camera |
-| Pupil Cam2 ID0/1 | `192x192` `400x400` | Core headset eye camera; max. 200 Hz sampling rate |
-| Pupil Cam3 ID0/1 | `192x192` `400x400` | HTC Vive add-on eye camera; max. 200 Hz sampling rate |
+| Pupil Cam2 ID0/1 | `192x192` `400x400` | Core headset eye camera; max. 200 Hz sampling rate |
+| Pupil Cam3 ID0/1 | `192x192` `400x400` | HTC Vive add-on eye camera; max. 200 Hz sampling rate |
| Logitech Webcam C930e | `640x480` `1280x720` `1920x1080` | Pupil Core high-definition scene camera (discontinued) |
-| Pupil Cam1 ID0/1 | `320x240` `640x480` | eye camera (discontinued); max. 120 Hz sampling rate |
+| Pupil Cam1 ID0/1 | `320x240` `640x480` | eye camera (discontinued); max. 120 Hz sampling rate |
When a recording is started in Pupil Capture, the application saves the active camera intrinsics to the `world.intrinsics`, `eye0.intrinsics`, and `eye1.intrinsics` files within the recording.
@@ -532,19 +532,19 @@ Based on the estimated intrinsics, one can calculate the camera's field of view
**Field of view in degrees:**
| Camera name | Resolution | Horizontal | Vertical | Diagonal |
-| :--- | :---: | :---: | :---: | :---: |
-| Pupil Cam1 ID2 (default – wide-angle lens) | `1920x1080` | 155° | 85° | --- |
-| | `1280x720` | 103° | 54° | 122° |
-| | `640x480` | 103° | 73° | 134° |
-| Pupil Cam1 ID2 (narrow-angle lens) | `1920x1080` | 88° | 54° | 106° |
-| | `1280x720` | 63° | 37° | 70° |
-| | `640x480` | 42° | 32° | 51° |
-| Pupil Cam2 ID0/1 | `400x400` | 39° | 39° | 53° |
-| | `192x192` | 37° | 37° | 51° |
-| Pupil Cam3 ID0/1 | `400x400` | 71° | 71° | 91° |
-| | `192x192` | 69° | 69° | 88° |
-| Logitech Webcam C930e (discontinued) | `1920x1080` | 82° | 53° | 91° |
-| | `1280x720` | 80° | 51° | 89° |
-| | `640x480` | 64° | 52° | 77° |
-| Pupil Cam1 ID0/1 (discontinued) | `640x480` | 51° | 39° | 62° |
-| | `320x240` | 51° | 39° | 61° |
+| :----------------------------------------- | :---------: | :--------: | :------: | :------: |
+| Pupil Cam1 ID2 (default – wide-angle lens) | `1920x1080` | 155° | 85° | --- |
+| | `1280x720` | 103° | 54° | 122° |
+| | `640x480` | 103° | 73° | 134° |
+| Pupil Cam1 ID2 (narrow-angle lens) | `1920x1080` | 88° | 54° | 106° |
+| | `1280x720` | 63° | 37° | 70° |
+| | `640x480` | 42° | 32° | 51° |
+| Pupil Cam2 ID0/1 | `400x400` | 39° | 39° | 53° |
+| | `192x192` | 37° | 37° | 51° |
+| Pupil Cam3 ID0/1 | `400x400` | 71° | 71° | 91° |
+| | `192x192` | 69° | 69° | 88° |
+| Logitech Webcam C930e (discontinued) | `1920x1080` | 82° | 53° | 91° |
+| | `1280x720` | 80° | 51° | 89° |
+| | `640x480` | 64° | 52° | 77° |
+| Pupil Cam1 ID0/1 (discontinued) | `640x480` | 51° | 39° | 62° |
+| | `320x240` | 51° | 39° | 61° |
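The values in this table follow from the pinhole camera model: given a focal length in pixels (from the camera's intrinsics) and an image dimension, the field of view along that axis is `2 * atan(size / (2 * f))`. A minimal sketch of that relationship (the focal length below is a made-up illustrative value, not one of the cameras above; real wide-angle lenses deviate from these numbers because of lens distortion):

```python
import math

def fov_deg(image_size_px: float, focal_length_px: float) -> float:
    """Field of view along one image axis of an ideal pinhole camera, in degrees."""
    return math.degrees(2 * math.atan2(image_size_px / 2, focal_length_px))

# Hypothetical focal length of 1000 px at 1280x720 resolution; in practice,
# read the focal length from the recording's .intrinsics files.
print(round(fov_deg(1280, 1000.0), 1))  # horizontal, ~65.2 degrees
print(round(fov_deg(720, 1000.0), 1))   # vertical, ~39.6 degrees
```

Note that this ignores distortion, so it is only a first-order approximation for the wide-angle lenses listed above.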
diff --git a/invisible/data-collection/data-format/index.md b/invisible/data-collection/data-format/index.md
index 51bd5b2ff..7732ad6a3 100644
--- a/invisible/data-collection/data-format/index.md
+++ b/invisible/data-collection/data-format/index.md
@@ -57,7 +57,7 @@ This file contains the timestamps of every world video frame.
## events.csv
-This file contains [event](https://docs.pupil-labs.com/neon/data-collection/events/) data for all recordings. It contains both project event annotations and real-time recording events.
+This file contains [event](/data-collection/events/) data for all recordings. It contains both project event annotations and real-time recording events.
| Field | Description |
| ------------------ | --------------------------------------------------------- |
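Because the timestamps are plain Unix nanoseconds, the file is straightforward to process with standard CSV tooling. A small sketch computing each event's offset from the first event (the rows and column names below are invented for illustration; check the field table above and your actual export for the exact columns):

```python
import csv
import io

# Made-up two-row events.csv fragment; column names are assumptions.
sample = """recording id,timestamp [ns],name,type
abc-123,1650000000000000000,recording.begin,recording
abc-123,1650000003500000000,stimulus_onset,project
"""

events = list(csv.DictReader(io.StringIO(sample)))

# Offset of each event from the first one, in seconds.
t0 = int(events[0]["timestamp [ns]"])
for e in events:
    print(e["name"], (int(e["timestamp [ns]"]) - t0) / 1e9)
```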
diff --git a/invisible/data-collection/lab-streaming-layer/index.md b/invisible/data-collection/lab-streaming-layer/index.md
index 2bf7176d9..e6b7703e7 100644
--- a/invisible/data-collection/lab-streaming-layer/index.md
+++ b/invisible/data-collection/lab-streaming-layer/index.md
@@ -1,2 +1,2 @@
# Lab Streaming Layer
-This page is still a work in progress. To use LSL with Neon you need to use our [LSL Relay](https://github.com/pupil-labs/lsl-relay). More documentation is coming soon!
\ No newline at end of file
+This page is still a work in progress. To use LSL with Pupil Invisible you need to use our [LSL Relay](https://github.com/pupil-labs/lsl-relay). More documentation is coming soon!
\ No newline at end of file
diff --git a/invisible/data-collection/monitor-app/index.md b/invisible/data-collection/monitor-app/index.md
index cee3924f9..044fd32fb 100644
--- a/invisible/data-collection/monitor-app/index.md
+++ b/invisible/data-collection/monitor-app/index.md
@@ -16,7 +16,7 @@ Once the page has loaded, you will be presented with a live stream of the scene
If you have multiple Pupil Invisible devices connected to the network, you can switch between devices using the switch button.
-Using the various event buttons, you can remotely save an [event](https://docs.pupil-labs.com/neon/data-collection/events/) in a recording to mark points of interest. In the settings view you can customize the according event names.
+Using the various event buttons, you can remotely save an [event](/data-collection/events/) in a recording to mark points of interest. In the settings view, you can customize the corresponding event names.

diff --git a/invisible/data-collection/recordings/index.md b/invisible/data-collection/recordings/index.md
index ba1be147d..2e8afa69a 100644
--- a/invisible/data-collection/recordings/index.md
+++ b/invisible/data-collection/recordings/index.md
@@ -8,7 +8,7 @@ The Pupil Invisible Companion app has several more features to ensure robust dat
## Wearers, Templates & Events
When making a recording, you can capture additional data such as metadata or key events that happened during data collection. These will be saved as part of the recording itself.
-[**Wearers**](/data-collection/wearers/) are the people who wear the Neon device while recording.
+[**Wearers**](/data-collection/wearers/) are the people who wear the Pupil Invisible Glasses while recording.
[**Templates**](/data-collection/templates/) are questionnaires that can be filled out at recording time.
diff --git a/invisible/data-collection/time-synchronization/index.md b/invisible/data-collection/time-synchronization/index.md
index 35ab629d9..d37bd1f67 100644
--- a/invisible/data-collection/time-synchronization/index.md
+++ b/invisible/data-collection/time-synchronization/index.md
@@ -1,5 +1,5 @@
# Achieve super-precise Time Sync
-For some applications, it is critical to accurately synchronize your Pupil Invisible device with another clock. That could be from a second Pupil Invisible device, an external sensor, or a computer you use for stimulus presentation.
+For some applications, it is critical to accurately synchronize your Pupil Invisible device with another clock. That could be from a second Pupil Invisible device, an external sensor, or a computer you use for stimulus presentation.
Pupil Invisible provides UTC timestamps for all the data it generates, which makes it easy to sync the data to anything. Those timestamps are generated using the clock of the Companion device. However, digital clocks can suffer from drift, meaning that they sometimes run slightly too fast or slow. Over time this error accumulates and can lead to discrepancies when comparing two clocks.
@@ -27,7 +27,7 @@ NTP_SERVER=north-america.pool.ntp.org
### Improving Synchronization further
While an error of `<20 ms` is sufficient for most applications, some require even better synchronization. To achieve this, you can estimate the offset between the clock used by Pupil Invisible and the external clock down to single millisecond accuracy.
-This can be done using the `TimeOffsetEstimator` of the [real-time API](https://docs.pupil-labs.com/invisible/real-time-api/introduction/). Using the following code, you can estimate the offset between the Pupil Invisible clock and the clock of the host executing the code.
+This can be done using the `TimeOffsetEstimator` of the [real-time API](/real-time-api/tutorials/). Using the following code, you can estimate the offset between the Pupil Invisible clock and the clock of the host executing the code.
::: tip
**Dependency**: `pip install "pupil-labs-realtime-api>=1.1.0"`
:::
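The estimation works like NTP: the host timestamps a request just before sending it and again when the reply arrives, while the device timestamps it in between. Assuming the network delay is roughly symmetric, the offset falls out of those three values. A sketch of the single-sample math only (the `TimeOffsetEstimator` in the real-time API package repeats this over many samples for you; all numbers below are made up):

```python
def clock_offset(host_send: float, device_ts: float, host_recv: float) -> float:
    """NTP-style single-sample clock offset estimate, in the same units
    as the inputs. Assumes symmetric network delay."""
    # The device is assumed to have timestamped at the midpoint of the round trip.
    return device_ts - (host_send + host_recv) / 2.0

# Hypothetical round trip: request sent at t=100.000 s on the host clock,
# handled at t=105.010 s on the device clock, reply received at t=100.020 s.
print(clock_offset(100.000, 105.010, 100.020))  # device clock ahead by ~5 s
```

Averaging many such samples, and discarding ones with a long round trip, is what pushes the accuracy down to single milliseconds.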
diff --git a/invisible/index.md b/invisible/index.md
index fc3a5985a..48999b568 100644
--- a/invisible/index.md
+++ b/invisible/index.md
@@ -26,7 +26,7 @@ cards:
details: Pupil Cloud is a powerful tool for managing your data, analyzing your recordings, and collaborating with your team. It's our recommended tool for analysis.
link: { text: View, href: "/invisible/pupil-cloud/" }
- title: Real-Time API
- details: You can access the data Neon generates in real-time and remote control it using its real-time API. Learn how it works here!
+ details: You can access the data Pupil Invisible generates in real-time and remote control it using its real-time API. Learn how it works here!
link: { text: View, href: "/invisible/real-time-api/tutorials/" }
- title: Chat
    details: Come chat with us on Discord! Do you have any issues or questions? Just wanna say hi? Join the chat and drop us a message!
diff --git a/invisible/pupil-cloud/index.md b/invisible/pupil-cloud/index.md
index 6f8fb53fc..05602f9cb 100644
--- a/invisible/pupil-cloud/index.md
+++ b/invisible/pupil-cloud/index.md
@@ -6,8 +6,8 @@ aside: false
-[Pupil Cloud](https://cloud.pupil-labs.com) is a web-based eye tracking platform for data logistics, analysis, and visualization. It is the recommended tool for processing your Neon recordings. It makes it easy to store all your data securely in one place and it offers a variety of options for analysis.
+[Pupil Cloud](https://cloud.pupil-labs.com) is a web-based eye tracking platform for data logistics, analysis, and visualization. It is the recommended tool for processing your Pupil Invisible recordings. It makes it easy to store all your data securely in one place and it offers a variety of options for analysis.
-If Cloud upload is enabled in the Neon Companion app, then all recordings will be uploaded automatically to Pupil Cloud.
+If Cloud upload is enabled in the Pupil Invisible Companion app, then all recordings will be uploaded automatically to Pupil Cloud.
We have a strict privacy policy that ensures your recording data is accessible only by you and those you explicitly grant access to. Pupil Labs will never access your recording data unless you explicitly instruct us to.
diff --git a/invisible/real-time-api/tutorials/index.md b/invisible/real-time-api/tutorials/index.md
index 23451e467..3eb02b3e6 100644
--- a/invisible/real-time-api/tutorials/index.md
+++ b/invisible/real-time-api/tutorials/index.md
@@ -2,7 +2,7 @@
For some applications it is critical to have access to eye tracking data in real time. Imagine for example an application utilizing gaze interaction to allow users to press a button using only their eyes.
-In other cases it may be important to automatically start or stop a recording and save [events](https://docs.pupil-labs.com/neon/data-collection/events/). For example, you might want to launch a screen-based experiment and have the recording start automatically when the stimulus presentation begins. Additionally, you might want to save the timestamps of when the subject interacted with the screen.
+In other cases it may be important to automatically start or stop a recording and save [events](/data-collection/events/). For example, you might want to launch a screen-based experiment and have the recording start automatically when the stimulus presentation begins. Additionally, you might want to save the timestamps of when the subject interacted with the screen.
All of this is possible for developers using [Pupil Invisible's Realtime Network API](https://github.com/pupil-labs/realtime-network-api). It allows you to stream gaze data and scene video to any device connected to the same local network. Further, you can control all devices remotely to start and stop recordings or save events.
@@ -72,7 +72,7 @@ device.recording_stop_and_save()
Started recording with id 2f99d9f9-f009-4015-97dd-eb253de443b0
-While a recording is running, you can save [events](https://docs.pupil-labs.com/neon/data-collection/events/)
+While a recording is running, you can save [events](/data-collection/events/)
using the [`send_event`](https://pupil-labs-realtime-api.readthedocs.io/en/stable/api/simple.html#pupil_labs.realtime_api.simple.Device.send_event) method.
By default, the Pupil Invisible device receiving the event will assign a timestamp to it,
using the time of arrival. Optionally, you can set a custom nanosecond timestamp for your event instead.
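A custom timestamp is just a Unix epoch time in nanoseconds, computed on whatever clock you want the event aligned to. A brief sketch (the `send_event` call is shown as a comment only, since it requires a connected device; the event name is made up):

```python
import time

# Unix timestamp in nanoseconds (UTC epoch) for a custom event time.
event_ts_ns = time.time_ns()

# With a device object from the real-time API (not executed here):
#   device.send_event("stimulus_onset")  # device assigns time of arrival
#   device.send_event("stimulus_onset", event_timestamp_unix_ns=event_ts_ns)
print(event_ts_ns > 1_500_000_000 * 10**9)  # sanity check: timestamp is after ~2017
```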
diff --git a/invisible/understand-the-ecosystem/index.md b/invisible/understand-the-ecosystem/index.md
index ae515146a..27f5d94c5 100644
--- a/invisible/understand-the-ecosystem/index.md
+++ b/invisible/understand-the-ecosystem/index.md
@@ -1,18 +1,18 @@
# Understand the Ecosystem
-The Neon ecosystem contains a range of tools that support you during data collection and data analysis. This overview introduces all the key components so you can become familiar with all tools at your disposal.
+The Pupil Invisible ecosystem contains a range of tools that support you during data collection and data analysis. This overview introduces all the key components so you can become familiar with all tools at your disposal.
-## Neon Companion app
+## Pupil Invisible Companion app
-You should have already used the Neon Companion app to [make your first recording](/data-collection/first-recording/). This app is the core of every Neon data collection.
+You should have already used the Pupil Invisible Companion app to [make your first recording](/data-collection/first-recording/). This app is the core of every Pupil Invisible data collection.
-When your Neon is connected to the Companion device, it supplies it with power and the app calculates a real-time gaze signal. When making a recording, all generated data is saved on the Companion device.
+When your Pupil Invisible is connected to the Companion device, it supplies it with power and the app calculates a real-time gaze signal. When making a recording, all generated data is saved on the Companion device.
The app automatically saves [UTC timestamps](https://en.wikipedia.org/wiki/Coordinated_Universal_Time) for every generated data sample. This allows you to easily sync your data with other 3rd party sensors, or to sync recordings from multiple subjects that have been made in parallel.
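Because every sample carries a UTC timestamp in Unix nanoseconds, aligning the data with a third-party sensor is essentially a unit conversion. A minimal sketch (the timestamp value below is made up for illustration):

```python
from datetime import datetime, timezone

ts_ns = 1650000000123456789  # hypothetical sample timestamp (Unix epoch, ns)
dt = datetime.fromtimestamp(ts_ns / 1e9, tz=timezone.utc)
print(dt.isoformat())  # wall-clock UTC time, comparable to any other sensor's clock
```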
## Pupil Cloud
-Pupil Cloud is our web-based storage and analysis platform located at [cloud.pupil-labs.com](https://cloud.pupil-labs.com/). It is the recommended tool for handling your Neon recordings.
+Pupil Cloud is our web-based storage and analysis platform located at [cloud.pupil-labs.com](https://cloud.pupil-labs.com/). It is the recommended tool for handling your Pupil Invisible recordings.
It makes it easy to store all your data securely in one place and it offers a variety of options for analysis.

@@ -23,22 +23,22 @@ From here, you can either download the raw data in a convenient format or use so
We have a strict [privacy policy](https://pupil-labs.com/legal/) that ensures your recording data is accessible only by you and those you explicitly grant access to. Pupil Labs will never access your recording data unless you explicitly instruct us to.
-If Cloud upload is enabled in the Neon Companion app, then recordings will be uploaded automatically to Pupil Cloud. No additional effort is required for data logistics.
+If Cloud upload is enabled in the Pupil Invisible Companion app, then recordings will be uploaded automatically to Pupil Cloud. No additional effort is required for data logistics.
-## Neon Monitor
+## Pupil Invisible Monitor
-
+
-All data generated by Neon can be monitored in real-time using the Neon Monitor app. To access the app simply visit [neon.local:8080](http://neon.local:8080) in your browser while being connected to the same WiFi network as the Companion device.
+All data generated by Pupil Invisible can be monitored in real-time using the Pupil Invisible Monitor app. To access the app, simply visit [pi.local:8080](http://pi.local:8080) in your browser while connected to the same WiFi network as the Companion device.
-The real-time streaming app enables you to monitor and control a recording session remotely without having to directly interact with the Companion device carried by the subject. You can view the scene video and gaze data remotely in your browser. You can also start and stop recordings and annotate important moments in time with [events](https://docs.pupil-labs.com/neon/data-collection/events/). All data will be saved with the recording on the Companion device.
+The real-time streaming app enables you to monitor and control a recording session remotely without having to directly interact with the Companion device carried by the subject. You can view the scene video and gaze data remotely in your browser. You can also start and stop recordings and annotate important moments in time with [events](/data-collection/events/). All data will be saved with the recording on the Companion device.
## Real-Time API
-All data generated by Neon is accessible to developers in real-time via our **Real-Time API**. Similar to the Neon Monitor app, it allows you to stream data, remotely start/stop recordings and save events. The only requirement is that the Companion device and the computer using the API are connected to the same WiFi network.
+All data generated by Pupil Invisible is accessible to developers in real-time via our **Real-Time API**. Similar to the Pupil Invisible Monitor app, it allows you to stream data, remotely start/stop recordings and save events. The only requirement is that the Companion device and the computer using the API are connected to the same WiFi network.
-This enables you to e.g. implement Human Computer Interaction (HCI) applications with Neon or to streamline your experiments by remotely controlling your devices and saving events automatically.
+This enables you, for example, to implement Human-Computer Interaction (HCI) applications with Pupil Invisible or to streamline your experiments by remotely controlling your devices and saving events automatically.
-Check-out out our [real-time API tutorial](https://docs.pupil-labs.com/neon/real-time-api/tutorials/) to learn more!
+Check out our [real-time API tutorial](/real-time-api/tutorials/) to learn more!
For a concrete usage example, see [Track your Experiment Progress using Events](/real-time-api/track-your-experiment-progress-using-events/)!
diff --git a/landing-page/.vitepress/config.mts b/landing-page/.vitepress/config.mts
index da30ed3ef..38ef9d17d 100644
--- a/landing-page/.vitepress/config.mts
+++ b/landing-page/.vitepress/config.mts
@@ -1,17 +1,17 @@
import { fileURLToPath, URL } from "node:url";
import { defineConfig } from "vitepress";
-const url = `https://docs.pupil-labs.com/`;
-
-import { config as default_config } from "./../../default_config.mts";
-import { theme_config as default_theme_config } from "./../../default_config.mts";
+import {
+ config as default_config,
+ theme_config as default_theme_config,
+} from "./../../default_config.mts";
const theme_config_additions = {
nav: [
- { text: "Neon", link: url + "neon/", target: "_self" },
- { text: "Invisible", link: url + "invisible/", target: "_self" },
- { text: "Core", link: url + "core/", target: "_self" },
- { text: "Alpha Lab", link: url + "alpha-lab/", target: "_self" },
+ { text: "Neon", link: "/neon/", target: "_self" },
+ { text: "Invisible", link: "/invisible/", target: "_self" },
+ { text: "Core", link: "/core/", target: "_self" },
+ { text: "Alpha Lab", link: "/alpha-lab/", target: "_self" },
],
socialLinks: [
{ icon: "discord", link: "https://pupil-labs.com/chat" },
@@ -19,7 +19,7 @@ const theme_config_additions = {
icon: {
svg: '',
},
- link: "https://docs.pupil-labs.com/alpha-lab",
+ link: "/alpha-lab",
target: "_self",
},
],
@@ -34,6 +34,24 @@ const config_additions = {
title: "Home",
titleTemplate: ":title - Pupil Labs Docs",
description: "Documentation for all Pupil Labs products.",
+ head: [
+ [
+ "script",
+ {},
+ `
+ if (navigator.serviceWorker) {
+ navigator.serviceWorker.getRegistrations().then(
+ function (registrations) {
+ for (let idx in registrations) {
+ registrations[idx].unregister()
+ }
+ }
+ )
+ }
+
+ `,
+ ],
+ ],
vite: {
resolve: {
alias: [
diff --git a/landing-page/index.md b/landing-page/index.md
index 51aff05b3..36b81f11c 100644
--- a/landing-page/index.md
+++ b/landing-page/index.md
@@ -11,53 +11,53 @@ products:
    details: Neon is our most powerful eye tracker. It features research-grade gaze and pupil diameter estimation, industry-leading robustness in real-world applications, and a pleasant calibration-free user experience.
image: "/neon-doc.webp"
link:
- { text: "Enter Documentation", href: "https://docs.pupil-labs.com/neon/" }
+ { text: "Enter Documentation", href: "/neon/" }
- title: Pupil Invisible
details: Pupil Invisible was the world's first calibration-free eye tracker and set new standards for robustness in real-world applications. It is now deprecated and replaced by Neon.
image: "/invisible-doc.webp"
link:
{
text: "Enter Documentation",
- href: "https://docs.pupil-labs.com/invisible/",
+ href: "/invisible/",
}
- title: Pupil Core
details: Pupil Core is a modular and more affordable eye tracker. It comes with open-source software and was the first eye tracker Pupil Labs created.
image: "/core-doc.webp"
link:
- { text: "Enter Documentation", href: "https://docs.pupil-labs.com/core/" }
+ { text: "Enter Documentation", href: "/core/" }
alpha:
title: "Alpha Lab"
tagline: "This is the space for our latest experiments with our tools. Alpha Lab is not a place for official product documentation. Everything you find here should be considered a work in progress, and may even be a bit rough around the edges."
action:
- { text: View all articles, link: "https://docs.pupil-labs.com/alpha-lab/" }
+ { text: View all articles, link: "/alpha-lab/" }
cards:
- title: Define AOIs and Calculate Gaze Metrics
details: Here we demonstrate how to make areas of interest using data downloaded from Pupil Cloud’s Reference Image Mapper.
link:
{
text: View,
- href: "https://docs.pupil-labs.com/alpha-lab/gaze-metrics-in-aois/#define-aois-and-calculate-gaze-metrics",
+ href: "/alpha-lab/gaze-metrics-in-aois/#define-aois-and-calculate-gaze-metrics",
}
  - title: Map and visualise gaze onto display content using the Reference Image Mapper
details: Here we show you how you can use Pupil Cloud’s Reference Image Mapper to map gaze onto dynamic on-screen content - like a video.
link:
{
text: View,
- href: "https://docs.pupil-labs.com/alpha-lab/map-your-gaze-to-a-2d-screen/#map-and-visualise-gaze-onto-a-display-content-using-the-reference-image-mapper",
+ href: "/alpha-lab/map-your-gaze-to-a-2d-screen/#map-and-visualise-gaze-onto-a-display-content-using-the-reference-image-mapper",
}
- title: Map gaze onto body parts using DensePose
    details: Use Detectron2's DensePose model to segment body parts and determine which part of the body a person is looking at.
link:
{
text: View,
- href: "https://docs.pupil-labs.com/alpha-lab/dense-pose/#map-gaze-onto-body-parts-using-densepose",
+ href: "/alpha-lab/dense-pose/#map-gaze-onto-body-parts-using-densepose",
}
- title: Map and visualize gaze on multiple reference images taken from the same environment
details: We pushed the limits of markerless mapping with Pupil Cloud’s Reference Image Mapper - scanning an entire apartment.
link:
{
text: View,
- href: "https://docs.pupil-labs.com/alpha-lab/multiple-rim/#map-and-visualize-gaze-on-multiple-reference-images-taken-from-the-same-environment",
+ href: "/alpha-lab/multiple-rim/#map-and-visualize-gaze-on-multiple-reference-images-taken-from-the-same-environment",
}
---
diff --git a/landing-page/public/service-worker.js b/landing-page/public/service-worker.js
new file mode 100644
index 000000000..05eb5c015
--- /dev/null
+++ b/landing-page/public/service-worker.js
@@ -0,0 +1,11 @@
+console.log("Cleanup Service Worker Starting");
+
+caches.keys()
+ .then(keys =>
+ Promise.all(
+ keys.map(async key => console.log("caches.delete", key, await caches.delete(key)))))
+ .then(async () => {
+ console.log("registration.unregister", await registration.unregister());
+ })
+ .then(() => console.log("DONE"))
+ .catch(console.error);
\ No newline at end of file
diff --git a/neon/pupil-cloud/enrichments/reference-image-mapper/index.md b/neon/pupil-cloud/enrichments/reference-image-mapper/index.md
index 21e815fc1..964568bda 100644
--- a/neon/pupil-cloud/enrichments/reference-image-mapper/index.md
+++ b/neon/pupil-cloud/enrichments/reference-image-mapper/index.md
@@ -13,11 +13,11 @@ A heatmap of gaze data mapped onto the reference image can be generated, and map
As described in the setup video, you will need two things in addition to your eye tracking recording(s) to produce a Reference Image Mapper enrichment:
1. A reference image
-2. A scanning video of the object/feature(s) taken with Invisible/Neon’s scene camera
+2. A scanning video of the object/feature(s) taken with Neon’s scene camera
:::tip
**Reference Image**
-Only the scanning recording needs to be taken with Invisible/Neon. The reference image can be taken with any camera. It can be also be a scanned image.
+Only the scanning recording needs to be taken with Neon. The reference image can be taken with any camera. It can also be a scanned image.
Duration
diff --git a/neon/real-time-api/track-your-experiment-in-matlab/index.md b/neon/real-time-api/track-your-experiment-in-matlab/index.md
index f89d99999..7b87f5144 100644
--- a/neon/real-time-api/track-your-experiment-in-matlab/index.md
+++ b/neon/real-time-api/track-your-experiment-in-matlab/index.md
@@ -145,7 +145,7 @@ Several arguments can be used to control the wrapper:
- The default is `status`.
See the section [Running the demo](#running-the-demo) on how to use them.
- `EventName`: followed by a string with the annotation name for the event, default is `Test event`.
-- `URLhost`: followed by a string containing the URL of Pupil Invisible, default is `http://neon.local:8080/`. It's generally good practice to call directly to the URL. Nevertheless, Matlab does resolve the DNS and stores it in the cache, so you will only notice a delay in the first call.
+- `URLhost`: followed by a string containing the URL of Neon, default is `http://neon.local:8080/`. It's generally good practice to call the URL directly. Nevertheless, MATLAB resolves the DNS name and caches it, so you will only notice a delay on the first call.
## Notes and Disclaimers