Don't use \mathcal, which MathJax by default renders as squares on Colab+macOS
patnr committed Sep 10, 2024
1 parent 49637ae commit 93159e9
Showing 10 changed files with 30 additions and 30 deletions.
10 changes: 5 additions & 5 deletions notebooks/T2 - Gaussian distribution.ipynb
@@ -64,14 +64,14 @@
"metadata": {},
"source": [
"## The univariate (a.k.a. 1-dimensional, scalar) case\n",
"Consider the Gaussian random variable $x \\sim \\mathcal{N}(\\mu, \\sigma^2)$. \n",
"Consider the Gaussian random variable $x \\sim \\mathscr{N}(\\mu, \\sigma^2)$. \n",
"Its probability density function (**pdf**),\n",
"$\n",
"p(x) = \\mathcal{N}(x \\mid \\mu, \\sigma^2)\n",
"p(x) = \\mathscr{N}(x \\mid \\mu, \\sigma^2)\n",
"$ for $x \\in (-\\infty, +\\infty)$,\n",
"is given by\n",
"$$\\begin{align}\n",
"\\mathcal{N}(x \\mid \\mu, \\sigma^2) = (2 \\pi \\sigma^2)^{-1/2} e^{-(x-\\mu)^2/2 \\sigma^2} \\,. \\tag{G1}\n",
"\\mathscr{N}(x \\mid \\mu, \\sigma^2) = (2 \\pi \\sigma^2)^{-1/2} e^{-(x-\\mu)^2/2 \\sigma^2} \\,. \\tag{G1}\n",
"\\end{align}$$\n",
"\n",
"Run the cell below to define a function to compute the pdf (G1) using the `scipy` library."
@@ -157,7 +157,7 @@
"id": "94b6d541",
"metadata": {},
"source": [
"**Exc -- Derivatives:** Recall $p(x) = \\mathcal{N}(x \\mid \\mu, \\sigma^2)$ from eqn (G1). \n",
"**Exc -- Derivatives:** Recall $p(x) = \\mathscr{N}(x \\mid \\mu, \\sigma^2)$ from eqn (G1). \n",
"Use pen, paper, and calculus to answer the following questions, \n",
"which derive some helpful mnemonics about the distribution.\n",
"\n",
@@ -201,7 +201,7 @@
"metadata": {},
"source": [
"#### Exc (optional) -- Integrals\n",
"Recall $p(x) = \\mathcal{N}(x \\mid \\mu, \\sigma^2)$ from eqn (G1). Abbreviate it using $c = (2 \\pi \\sigma^2)^{-1/2}$. \n",
"Recall $p(x) = \\mathscr{N}(x \\mid \\mu, \\sigma^2)$ from eqn (G1). Abbreviate it using $c = (2 \\pi \\sigma^2)^{-1/2}$. \n",
"Use pen, paper, and calculus to show that\n",
" - (i) the first parameter, $\\mu$, indicates its **mean**, i.e. that $$\\mu = \\Expect[x] \\,.$$\n",
" *Hint: you can rely on the result of (iii)*\n",
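The notebook cell referenced above ("define a function to compute the pdf (G1) using the `scipy` library") is elided from this diff; a minimal sketch of such a function (the name `pdf_G1` is illustrative, not taken from the repo) could be:

```python
import numpy as np
from scipy.stats import norm

def pdf_G1(x, mu, sigma2):
    """Univariate Gaussian pdf, eqn (G1): N(x | mu, sigma^2)."""
    # scipy parameterizes by the standard deviation, hence the sqrt.
    return norm.pdf(x, loc=mu, scale=np.sqrt(sigma2))

# Cross-check against the explicit formula in (G1):
x, mu, sigma2 = 1.5, 0.0, 25.0
explicit = (2 * np.pi * sigma2) ** -0.5 * np.exp(-(x - mu) ** 2 / (2 * sigma2))
assert np.isclose(pdf_G1(x, mu, sigma2), explicit)
```
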
14 changes: 7 additions & 7 deletions notebooks/T3 - Bayesian inference.ipynb
@@ -48,7 +48,7 @@
"studied the Gaussian probability density function (pdf), defined by:\n",
"\n",
"$$\\begin{align}\n",
"\\mathcal{N}(x \\mid \\mu, \\sigma^2) &= (2 \\pi \\sigma^2)^{-1/2} e^{-(x-\\mu)^2/2 \\sigma^2} \\,,\\tag{G1} \\\\\n",
"\\mathscr{N}(x \\mid \\mu, \\sigma^2) &= (2 \\pi \\sigma^2)^{-1/2} e^{-(x-\\mu)^2/2 \\sigma^2} \\,,\\tag{G1} \\\\\n",
"\\NormDist(\\x \\mid \\mathbf{\\mu}, \\mathbf{\\Sigma})\n",
"&=\n",
"|2 \\pi \\mathbf{\\Sigma}|^{-1/2} \\, \\exp\\Big(-\\frac{1}{2}\\|\\x-\\mathbf{\\mu}\\|^2_\\mathbf{\\Sigma} \\Big) \\,, \\tag{GM}\n",
@@ -476,12 +476,12 @@
"In response to this computational difficulty, we try to be smart and do something more analytical (\"pen-and-paper\"): we only compute the parameters (mean and (co)variance) of the posterior pdf.\n",
"\n",
"This is doable and quite simple in the Gaussian-Gaussian case, when $\\ObsMod$ is linear (i.e. just a number): \n",
"- Given the prior of $p(x) = \\mathcal{N}(x \\mid x\\supf, P\\supf)$\n",
"- and a likelihood $p(y|x) = \\mathcal{N}(y \\mid \\ObsMod x,R)$, \n",
"- Given the prior of $p(x) = \\mathscr{N}(x \\mid x\\supf, P\\supf)$\n",
"- and a likelihood $p(y|x) = \\mathscr{N}(y \\mid \\ObsMod x,R)$, \n",
"- $\\implies$ posterior\n",
"$\n",
"p(x|y)\n",
"= \\mathcal{N}(x \\mid x\\supa, P\\supa) \\,,\n",
"= \\mathscr{N}(x \\mid x\\supa, P\\supa) \\,,\n",
"$\n",
"where, in the 1-dimensional/univariate/scalar (multivariate is discussed in [T5](T5%20-%20Multivariate%20Kalman%20filter.ipynb)) case:\n",
"\n",
@@ -501,7 +501,7 @@
"- (a) Actually derive the first term of the RHS, i.e. eqns. (5) and (6). \n",
" *Hint: you can simplify the task by first \"hiding\" $\\ObsMod$ by astutely multiplying by $1$ somewhere.*\n",
"- (b) *Optional*: Derive the full RHS (i.e. also the second term).\n",
"- (c) Derive $p(x|y) = \\mathcal{N}(x \\mid x\\supa, P\\supa)$ from eqns. (5) and (6)\n",
"- (c) Derive $p(x|y) = \\mathscr{N}(x \\mid x\\supa, P\\supa)$ from eqns. (5) and (6)\n",
" using part (a), Bayes' rule (BR2), and the Gaussian pdf (G1)."
]
},
@@ -522,11 +522,11 @@
"source": [
"**Exc -- Temperature example:**\n",
"The statement $x = \\mu \\pm \\sigma$ is *sometimes* used\n",
"as a shorthand for $p(x) = \\mathcal{N}(x \\mid \\mu, \\sigma^2)$. Suppose\n",
"as a shorthand for $p(x) = \\mathscr{N}(x \\mid \\mu, \\sigma^2)$. Suppose\n",
"- you think the temperature $x = 20°C \\pm 2°C$,\n",
"- a thermometer yields the observation $y = 18°C \\pm 2°C$.\n",
"\n",
"Show that your posterior is $p(x|y) = \\mathcal{N}(x \\mid 19, 2)$"
"Show that your posterior is $p(x|y) = \\mathscr{N}(x \\mid 19, 2)$"
]
},
{
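Eqns (5) and (6) themselves sit in the elided part of this hunk. One standard precision-weighted form of the scalar Gaussian-Gaussian update, applied to the temperature exercise above (`gg_update` is a hypothetical name, not from the repo):

```python
def gg_update(xf, Pf, y, R, H=1.0):
    """Scalar Gaussian-Gaussian Bayes update: the posterior precision is the
    sum of the prior precision and the (H-scaled) observation precision."""
    Pa = 1.0 / (1.0 / Pf + H**2 / R)
    xa = Pa * (xf / Pf + H * y / R)
    return xa, Pa

# Temperature example: x = 20 ± 2, y = 18 ± 2  =>  p(x|y) = N(x | 19, 2)
xa, Pa = gg_update(xf=20.0, Pf=2.0**2, y=18.0, R=2.0**2)
print(xa, Pa)  # 19.0 2.0
```

With equal variances the posterior mean lands halfway between prior mean and observation, and the variance is halved — the mnemonic the exercise is after.
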
2 changes: 1 addition & 1 deletion notebooks/T4 - Time series filtering.ipynb
@@ -208,7 +208,7 @@
"Formulae (5) and (6) are called the **forecast step** of the KF.\n",
"But when $y_1$ becomes available, according to eqn. (Obs),\n",
"then we can update/condition our estimate of $x_1$, i.e. compute the posterior,\n",
"$p(x_1 | y_1) = \\mathcal{N}(x_1 \\mid x\\supa_1, P\\supa_1) \\,,$\n",
"$p(x_1 | y_1) = \\mathscr{N}(x_1 \\mid x\\supa_1, P\\supa_1) \\,,$\n",
"using the formulae we developed for Bayes' rule with\n",
"[Gaussian distributions](T3%20-%20Bayesian%20inference.ipynb#Gaussian-Gaussian-Bayes'-rule-(1D)).\n",
"\n",
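The forecast/analysis cycle described above can be sketched in scalar form — assuming linear dynamics `M`, model-noise variance `Q`, observation-noise variance `R`, and an observation operator equal to 1 (`kf_cycle` and its defaults are illustrative, not from the repo):

```python
def kf_cycle(xa, Pa, y, M=0.97, Q=1.0, R=1.0):
    """One scalar Kalman-filter cycle: forecast, then condition on y."""
    # Forecast step: push the previous analysis through the linear dynamics.
    xf = M * xa
    Pf = M**2 * Pa + Q
    # Analysis step: Gaussian-Gaussian Bayes rule, written via the Kalman gain.
    K = Pf / (Pf + R)
    xa_new = xf + K * (y - xf)
    Pa_new = (1 - K) * Pf
    return xa_new, Pa_new
```

Iterating `kf_cycle` over a sequence of observations reproduces the filtering recursion the notebook builds up step by step.
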
2 changes: 1 addition & 1 deletion notebooks/T8 - Monte-Carlo & ensembles.ipynb
@@ -109,7 +109,7 @@
"\n",
"**Exc -- Multivariate Gaussian sampling:**\n",
"Suppose $\\z$ is a standard Gaussian,\n",
"i.e. $p(\\z) = \\mathcal{N}(\\z \\mid \\bvec{0},\\I_{\\xDim})$,\n",
"i.e. $p(\\z) = \\mathscr{N}(\\z \\mid \\bvec{0},\\I_{\\xDim})$,\n",
"where $\\I_{\\xDim}$ is the $\\xDim$-dimensional identity matrix. \n",
"Let $\\x = \\mat{L}\\z + \\mu$.\n",
"\n",
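The sampling recipe in the exercise above ($\x = \mat{L}\z + \mu$) can be sketched with NumPy, taking $\mat{L}$ as a Cholesky factor of a chosen covariance (all names and numbers here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
L = np.linalg.cholesky(Sigma)      # so that L @ L.T == Sigma

# Draw N standard-Gaussian vectors z ~ N(0, I), then map x = L z + mu.
N = 100_000
Z = rng.standard_normal((2, N))
X = L @ Z + mu[:, None]

print(np.mean(X, axis=1))   # ≈ mu
print(np.cov(X))            # ≈ Sigma
```

The sample mean and covariance converge to $\mu$ and $\mat{L}\mat{L}^T$ as $N$ grows, which is the point of the exercise.
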
2 changes: 1 addition & 1 deletion notebooks/resources/answers.py
@@ -592,7 +592,7 @@ def setup_typeset():
$$ \begin{align}
p\, (y_1, \ldots, y_K \;|\; a)
&= \prod_{k=1}^K \, p\, (y_k \;|\; a) \tag{each obs. is indep. of others, knowing $a$.} \\\
-&= \prod_k \, \mathcal{N}(y_k \mid a k,R) \tag{inserted eqn. (3) and then (1).} \\\
+&= \prod_k \, \NormDist(y_k \mid a k,R) \tag{inserted eqn. (3) and then (1).} \\\
&= \prod_k \, (2 \pi R)^{-1/2} e^{-(y_k - a k)^2/2 R} \tag{inserted eqn. (G1) from T2.} \\\
&= c \exp\Big(\frac{-1}{2 R}\sum_k (y_k - a k)^2\Big) \\\
\end{align} $$
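The product-of-Gaussians likelihood above peaks where $\sum_k (y_k - a k)^2$ is smallest, i.e. at the least-squares estimate $\hat{a} = \sum_k k \, y_k / \sum_k k^2$. A quick numerical check (synthetic data, not from the repo):

```python
import numpy as np

rng = np.random.default_rng(42)
a_true, R, K = 0.8, 1.0, 100
k = np.arange(1, K + 1)
y = a_true * k + np.sqrt(R) * rng.standard_normal(K)

# Maximizing prod_k N(y_k | a k, R) <=> minimizing sum_k (y_k - a k)^2,
# whose unique minimizer is the least-squares estimate:
a_hat = np.sum(k * y) / np.sum(k**2)
```

Here `a_hat` recovers `a_true` to within the estimator's standard deviation, $\sqrt{R / \sum_k k^2}$.
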
2 changes: 1 addition & 1 deletion notebooks/resources/macros.py
@@ -16,7 +16,7 @@
macros=r'''
\newcommand{\Reals}{\mathbb{R}}
\newcommand{\Expect}[0]{\mathbb{E}}
-\newcommand{\NormDist}{\mathcal{N}}
+\newcommand{\NormDist}{\mathscr{N}}
\newcommand{\DynMod}[0]{\mathscr{M}}
\newcommand{\ObsMod}[0]{\mathscr{H}}
10 changes: 5 additions & 5 deletions notebooks/scripts/T2 - Gaussian distribution.py
@@ -43,14 +43,14 @@


# ## The univariate (a.k.a. 1-dimensional, scalar) case
-# Consider the Gaussian random variable $x \sim \mathcal{N}(\mu, \sigma^2)$.
+# Consider the Gaussian random variable $x \sim \mathscr{N}(\mu, \sigma^2)$.
# Its probability density function (**pdf**),
# $
-# p(x) = \mathcal{N}(x \mid \mu, \sigma^2)
+# p(x) = \mathscr{N}(x \mid \mu, \sigma^2)
# $ for $x \in (-\infty, +\infty)$,
# is given by
# $$\begin{align}
-# \mathcal{N}(x \mid \mu, \sigma^2) = (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,. \tag{G1}
+# \mathscr{N}(x \mid \mu, \sigma^2) = (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,. \tag{G1}
# \end{align}$$
#
# Run the cell below to define a function to compute the pdf (G1) using the `scipy` library.
@@ -89,7 +89,7 @@ def plot_pdf(mu=0, sigma=5):
# show_answer('pdf_G1')
# -

-# **Exc -- Derivatives:** Recall $p(x) = \mathcal{N}(x \mid \mu, \sigma^2)$ from eqn (G1).
+# **Exc -- Derivatives:** Recall $p(x) = \mathscr{N}(x \mid \mu, \sigma^2)$ from eqn (G1).
# Use pen, paper, and calculus to answer the following questions,
# which derive some helpful mnemonics about the distribution.
#
@@ -115,7 +115,7 @@ def plot_pdf(mu=0, sigma=5):
# -

# #### Exc (optional) -- Integrals
-# Recall $p(x) = \mathcal{N}(x \mid \mu, \sigma^2)$ from eqn (G1). Abbreviate it using $c = (2 \pi \sigma^2)^{-1/2}$.
+# Recall $p(x) = \mathscr{N}(x \mid \mu, \sigma^2)$ from eqn (G1). Abbreviate it using $c = (2 \pi \sigma^2)^{-1/2}$.
# Use pen, paper, and calculus to show that
# - (i) the first parameter, $\mu$, indicates its **mean**, i.e. that $$\mu = \Expect[x] \,.$$
# *Hint: you can rely on the result of (iii)*
14 changes: 7 additions & 7 deletions notebooks/scripts/T3 - Bayesian inference.py
@@ -34,7 +34,7 @@
# studied the Gaussian probability density function (pdf), defined by:
#
# $$\begin{align}
-# \mathcal{N}(x \mid \mu, \sigma^2) &= (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,,\tag{G1} \\
+# \mathscr{N}(x \mid \mu, \sigma^2) &= (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,,\tag{G1} \\
# \NormDist(\x \mid \mathbf{\mu}, \mathbf{\Sigma})
# &=
# |2 \pi \mathbf{\Sigma}|^{-1/2} \, \exp\Big(-\frac{1}{2}\|\x-\mathbf{\mu}\|^2_\mathbf{\Sigma} \Big) \,, \tag{GM}
@@ -299,12 +299,12 @@ def Bayes2( corr_R =.6, y1=1, R1=4**2,
# In response to this computational difficulty, we try to be smart and do something more analytical ("pen-and-paper"): we only compute the parameters (mean and (co)variance) of the posterior pdf.
#
# This is doable and quite simple in the Gaussian-Gaussian case, when $\ObsMod$ is linear (i.e. just a number):
-# - Given the prior of $p(x) = \mathcal{N}(x \mid x\supf, P\supf)$
-# - and a likelihood $p(y|x) = \mathcal{N}(y \mid \ObsMod x,R)$,
+# - Given the prior of $p(x) = \mathscr{N}(x \mid x\supf, P\supf)$
+# - and a likelihood $p(y|x) = \mathscr{N}(y \mid \ObsMod x,R)$,
# - $\implies$ posterior
# $
# p(x|y)
-# = \mathcal{N}(x \mid x\supa, P\supa) \,,
+# = \mathscr{N}(x \mid x\supa, P\supa) \,,
# $
# where, in the 1-dimensional/univariate/scalar (multivariate is discussed in [T5](T5%20-%20Multivariate%20Kalman%20filter.ipynb)) case:
#
@@ -324,7 +324,7 @@ def Bayes2( corr_R =.6, y1=1, R1=4**2,
# - (a) Actually derive the first term of the RHS, i.e. eqns. (5) and (6).
# *Hint: you can simplify the task by first "hiding" $\ObsMod$ by astutely multiplying by $1$ somewhere.*
# - (b) *Optional*: Derive the full RHS (i.e. also the second term).
-# - (c) Derive $p(x|y) = \mathcal{N}(x \mid x\supa, P\supa)$ from eqns. (5) and (6)
+# - (c) Derive $p(x|y) = \mathscr{N}(x \mid x\supa, P\supa)$ from eqns. (5) and (6)
# using part (a), Bayes' rule (BR2), and the Gaussian pdf (G1).

# +
@@ -333,11 +333,11 @@ def Bayes2( corr_R =.6, y1=1, R1=4**2,

# **Exc -- Temperature example:**
# The statement $x = \mu \pm \sigma$ is *sometimes* used
-# as a shorthand for $p(x) = \mathcal{N}(x \mid \mu, \sigma^2)$. Suppose
+# as a shorthand for $p(x) = \mathscr{N}(x \mid \mu, \sigma^2)$. Suppose
# - you think the temperature $x = 20°C \pm 2°C$,
# - a thermometer yields the observation $y = 18°C \pm 2°C$.
#
-# Show that your posterior is $p(x|y) = \mathcal{N}(x \mid 19, 2)$
+# Show that your posterior is $p(x|y) = \mathscr{N}(x \mid 19, 2)$

# +
# show_answer('GG BR example')
2 changes: 1 addition & 1 deletion notebooks/scripts/T4 - Time series filtering.py
@@ -153,7 +153,7 @@ def exprmt(seed=4, nTime=50, M=0.97, logR=1, logQ=1, analyses_only=False, logR_b
# Formulae (5) and (6) are called the **forecast step** of the KF.
# But when $y_1$ becomes available, according to eqn. (Obs),
# then we can update/condition our estimate of $x_1$, i.e. compute the posterior,
-# $p(x_1 | y_1) = \mathcal{N}(x_1 \mid x\supa_1, P\supa_1) \,,$
+# $p(x_1 | y_1) = \mathscr{N}(x_1 \mid x\supa_1, P\supa_1) \,,$
# using the formulae we developed for Bayes' rule with
# [Gaussian distributions](T3%20-%20Bayesian%20inference.ipynb#Gaussian-Gaussian-Bayes'-rule-(1D)).
#
2 changes: 1 addition & 1 deletion notebooks/scripts/T8 - Monte-Carlo & ensembles.py
@@ -77,7 +77,7 @@ def pdf_reconstructions(seed=5, nbins=10, bw=.3):
#
# **Exc -- Multivariate Gaussian sampling:**
# Suppose $\z$ is a standard Gaussian,
-# i.e. $p(\z) = \mathcal{N}(\z \mid \bvec{0},\I_{\xDim})$,
+# i.e. $p(\z) = \mathscr{N}(\z \mid \bvec{0},\I_{\xDim})$,
# where $\I_{\xDim}$ is the $\xDim$-dimensional identity matrix.
# Let $\x = \mat{L}\z + \mu$.
#
