From e5785fa91740f0d7247b2f35c11ab56f564efd30 Mon Sep 17 00:00:00 2001
From: nadje
Date: Mon, 8 Apr 2024 10:28:09 +0200
Subject: [PATCH] yt link and text addition

---
 alpha-lab/gaze-on-face/index.md | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/alpha-lab/gaze-on-face/index.md b/alpha-lab/gaze-on-face/index.md
index 4303b13ed..743119dc4 100644
--- a/alpha-lab/gaze-on-face/index.md
+++ b/alpha-lab/gaze-on-face/index.md
@@ -8,7 +8,7 @@ meta:
     - name: twitter:image
       content: "https://i.ytimg.com/vi/nt_zNSBMJWI/maxresdefault.jpg"
     - name: twitter:player
-      content: "https://www.youtube.com/embed/aWkFag5Z8LU"
+      content: "https://www.youtube.com/embed/vEshPxgWs3E"
     - name: twitter:width
       content: "1280"
     - name: twitter:height
@@ -26,7 +26,7 @@ import TagLinks from '@components/TagLinks.vue'
 
-
+
 
 ::: tip
 Decode the cues of social interactions and cognition: Does your attention drift towards your friend's eyes, or do you gaze at their lips as they talk? This guide shows how to map gaze on facial landmarks using the output of our Face Mapper enrichment.
@@ -49,6 +49,8 @@ reveal _which_ facial landmarks were gazed at. That is, did they look at the mou
 This guide introduces a tool to generate Areas of Interest (AOIs) around each facial landmark (like in the video above) and correlate the wearer's gaze with these. In other words, it automatically identifies which facial landmarks were gazed at on the interlocutor's face.
 
+It also computes the percentage of gaze mapped on each facial landmark, but you can use the results to compute your own outcome metrics.
+
 ## Setting Up Your Experiment
 
 When setting up your experiment, accurate mapping of gaze on facial landmarks depends on three crucial factors: the proximity