From 04ad33896757b9dc1b19a069444ef18f98004b85 Mon Sep 17 00:00:00 2001
From: Jonah Ramponi
Date: Sat, 30 Mar 2024 19:10:00 +0000
Subject: [PATCH] correcting img

---
 content/posts/intro_to_attention.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/content/posts/intro_to_attention.md b/content/posts/intro_to_attention.md
index f435d50..bf01018 100644
--- a/content/posts/intro_to_attention.md
+++ b/content/posts/intro_to_attention.md
@@ -153,7 +153,7 @@ Well, our *attention matrix* after softmax has been applied is simply $w$ with $
 
 The attention matrix is a nice thing to visualize. For our toy example, it might look like
 
-1[Attention Matrix Visualisation](/img/attnm.png)
+![Attention Matrix Visualisation](/img/attnm.png)
 
 What can we notice about our attention matrix?
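
As context for the hunk above, a minimal sketch of how an attention matrix like the one in the restored image might be computed and visualized. The scaled dot-product form and the `Q`, `K` array names are assumptions for illustration; the post's actual definition of $w$ is not shown in this hunk.

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy example: 4 tokens, embedding dimension 8 (illustrative sizes only).
rng = np.random.default_rng(0)
n_tokens, d = 4, 8
Q = rng.normal(size=(n_tokens, d))  # hypothetical query vectors
K = rng.normal(size=(n_tokens, d))  # hypothetical key vectors

# Raw scores w, then row-wise softmax to get the attention matrix.
w = Q @ K.T / np.sqrt(d)
attn = np.exp(w - w.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)

# Each row now sums to 1: row i shows how much token i attends to every token.
plt.imshow(attn, cmap="viridis")
plt.colorbar(label="attention weight")
plt.xlabel("key position")
plt.ylabel("query position")
plt.savefig("attnm.png")
```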