diff --git a/notebooks/T2 - Gaussian distribution.ipynb b/notebooks/T2 - Gaussian distribution.ipynb
index 9767176..c75e7d1 100644
--- a/notebooks/T2 - Gaussian distribution.ipynb
+++ b/notebooks/T2 - Gaussian distribution.ipynb
@@ -64,14 +64,14 @@
   "metadata": {},
   "source": [
    "## The univariate (a.k.a. 1-dimensional, scalar) case\n",
-    "Consider the Gaussian random variable $x \sim \mathcal{N}(\mu, \sigma^2)$. \n",
+    "Consider the Gaussian random variable $x \sim \mathscr{N}(\mu, \sigma^2)$. \n",
    "Its probability density function (**pdf**),\n",
    "$\n",
-    "p(x) = \mathcal{N}(x \mid \mu, \sigma^2)\n",
+    "p(x) = \mathscr{N}(x \mid \mu, \sigma^2)\n",
    "$ for $x \in (-\infty, +\infty)$,\n",
    "is given by\n",
    "$$\begin{align}\n",
-    "\mathcal{N}(x \mid \mu, \sigma^2) = (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,. \tag{G1}\n",
+    "\mathscr{N}(x \mid \mu, \sigma^2) = (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,. \tag{G1}\n",
    "\end{align}$$\n",
    "\n",
    "Run the cell below to define a function to compute the pdf (G1) using the `scipy` library."
@@ -157,7 +157,7 @@
   "id": "94b6d541",
   "metadata": {},
   "source": [
-    "**Exc -- Derivatives:** Recall $p(x) = \mathcal{N}(x \mid \mu, \sigma^2)$ from eqn (G1). \n",
+    "**Exc -- Derivatives:** Recall $p(x) = \mathscr{N}(x \mid \mu, \sigma^2)$ from eqn (G1). \n",
    "Use pen, paper, and calculus to answer the following questions, \n",
    "which derive some helpful mnemonics about the distribution.\n",
    "\n",
@@ -201,7 +201,7 @@
   "metadata": {},
   "source": [
    "#### Exc (optional) -- Integrals\n",
-    "Recall $p(x) = \mathcal{N}(x \mid \mu, \sigma^2)$ from eqn (G1). Abbreviate it using $c = (2 \pi \sigma^2)^{-1/2}$. \n",
+    "Recall $p(x) = \mathscr{N}(x \mid \mu, \sigma^2)$ from eqn (G1). Abbreviate it using $c = (2 \pi \sigma^2)^{-1/2}$. \n",
    "Use pen, paper, and calculus to show that\n",
    " - (i) the first parameter, $\mu$, indicates its **mean**, i.e. that $$\mu = \Expect[x] \,.$$\n",
    " *Hint: you can rely on the result of (iii)*\n",
diff --git a/notebooks/T3 - Bayesian inference.ipynb b/notebooks/T3 - Bayesian inference.ipynb
index 53a9fac..1a6b2a5 100644
--- a/notebooks/T3 - Bayesian inference.ipynb
+++ b/notebooks/T3 - Bayesian inference.ipynb
@@ -48,7 +48,7 @@
    "studied the Gaussian probability density function (pdf), defined by:\n",
    "\n",
    "$$\begin{align}\n",
-    "\mathcal{N}(x \mid \mu, \sigma^2) &= (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,,\tag{G1} \\\n",
+    "\mathscr{N}(x \mid \mu, \sigma^2) &= (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,,\tag{G1} \\\n",
    "\NormDist(\x \mid \mathbf{\mu}, \mathbf{\Sigma})\n",
    "&=\n",
    "|2 \pi \mathbf{\Sigma}|^{-1/2} \, \exp\Big(-\frac{1}{2}\|\x-\mathbf{\mu}\|^2_\mathbf{\Sigma} \Big) \,, \tag{GM}\n",
@@ -476,12 +476,12 @@
    "In response to this computational difficulty, we try to be smart and do something more analytical (\"pen-and-paper\"): we only compute the parameters (mean and (co)variance) of the posterior pdf.\n",
    "\n",
    "This is doable and quite simple in the Gaussian-Gaussian case, when $\ObsMod$ is linear (i.e. just a number): \n",
-    "- Given the prior of $p(x) = \mathcal{N}(x \mid x\supf, P\supf)$\n",
-    "- and a likelihood $p(y|x) = \mathcal{N}(y \mid \ObsMod x,R)$, \n",
+    "- Given the prior of $p(x) = \mathscr{N}(x \mid x\supf, P\supf)$\n",
+    "- and a likelihood $p(y|x) = \mathscr{N}(y \mid \ObsMod x,R)$, \n",
    "- $\implies$ posterior\n",
    "$\n",
    "p(x|y)\n",
-    "= \mathcal{N}(x \mid x\supa, P\supa) \,,\n",
+    "= \mathscr{N}(x \mid x\supa, P\supa) \,,\n",
    "$\n",
    "where, in the 1-dimensional/univariate/scalar (multivariate is discussed in [T5](T5%20-%20Multivariate%20Kalman%20filter.ipynb)) case:\n",
    "\n",
@@ -501,7 +501,7 @@ "- (a) Actually derive the first term of the RHS, i.e. eqns. (5) and (6). \n",
    " *Hint: you can simplify the task by first \"hiding\" $\ObsMod$ by astutely multiplying by $1$ somewhere.*\n",
    "- (b) *Optional*: Derive the full RHS (i.e. also the second term).\n",
-    "- (c) Derive $p(x|y) = \mathcal{N}(x \mid x\supa, P\supa)$ from eqns. (5) and (6)\n",
+    "- (c) Derive $p(x|y) = \mathscr{N}(x \mid x\supa, P\supa)$ from eqns. (5) and (6)\n",
    "    using part (a), Bayes' rule (BR2), and the Gaussian pdf (G1)."
   ]
  },
  {
@@ -522,11 +522,11 @@
   "metadata": {},
   "source": [
    "**Exc -- Temperature example:**\n",
    "The statement $x = \mu \pm \sigma$ is *sometimes* used\n",
-    "as a shorthand for $p(x) = \mathcal{N}(x \mid \mu, \sigma^2)$. Suppose\n",
+    "as a shorthand for $p(x) = \mathscr{N}(x \mid \mu, \sigma^2)$. Suppose\n",
    "- you think the temperature $x = 20°C \pm 2°C$,\n",
    "- a thermometer yields the observation $y = 18°C \pm 2°C$.\n",
    "\n",
-    "Show that your posterior is $p(x|y) = \mathcal{N}(x \mid 19, 2)$"
+    "Show that your posterior is $p(x|y) = \mathscr{N}(x \mid 19, 2)$"
   ]
  },
  {
diff --git a/notebooks/T4 - Time series filtering.ipynb b/notebooks/T4 - Time series filtering.ipynb
index b31faf9..325fc48 100644
--- a/notebooks/T4 - Time series filtering.ipynb
+++ b/notebooks/T4 - Time series filtering.ipynb
@@ -208,7 +208,7 @@
    "Formulae (5) and (6) are called the **forecast step** of the KF.\n",
    "But when $y_1$ becomes available, according to eqn. (Obs),\n",
    "then we can update/condition our estimate of $x_1$, i.e. compute the posterior,\n",
-    "$p(x_1 | y_1) = \mathcal{N}(x_1 \mid x\supa_1, P\supa_1) \,,$\n",
+    "$p(x_1 | y_1) = \mathscr{N}(x_1 \mid x\supa_1, P\supa_1) \,,$\n",
    "using the formulae we developed for Bayes' rule with\n",
    "[Gaussian distributions](T3%20-%20Bayesian%20inference.ipynb#Gaussian-Gaussian-Bayes'-rule-(1D)).\n",
    "\n",
diff --git a/notebooks/T8 - Monte-Carlo & ensembles.ipynb b/notebooks/T8 - Monte-Carlo & ensembles.ipynb
index cbb336a..35e478e 100644
--- a/notebooks/T8 - Monte-Carlo & ensembles.ipynb
+++ b/notebooks/T8 - Monte-Carlo & ensembles.ipynb
@@ -109,7 +109,7 @@
    "\n",
    "**Exc -- Multivariate Gaussian sampling:**\n",
    "Suppose $\z$ is a standard Gaussian,\n",
-    "i.e. $p(\z) = \mathcal{N}(\z \mid \bvec{0},\I_{\xDim})$,\n",
+    "i.e. $p(\z) = \mathscr{N}(\z \mid \bvec{0},\I_{\xDim})$,\n",
    "where $\I_{\xDim}$ is the $\xDim$-dimensional identity matrix. \n",
    "Let $\x = \mat{L}\z + \mu$.\n",
    "\n",
diff --git a/notebooks/resources/answers.py b/notebooks/resources/answers.py
index 813b62e..53e8163 100644
--- a/notebooks/resources/answers.py
+++ b/notebooks/resources/answers.py
@@ -592,7 +592,7 @@ def setup_typeset():
 $$ \begin{align}
 p\, (y_1, \ldots, y_K \;|\; a) &= \prod_{k=1}^K \, p\, (y_k \;|\; a) \tag{each obs. is indep. of others, knowing $a$.} \\\
-&= \prod_k \, \mathcal{N}(y_k \mid a k,R) \tag{inserted eqn. (3) and then (1).} \\\
+&= \prod_k \, \NormDist(y_k \mid a k,R) \tag{inserted eqn. (3) and then (1).} \\\
 &= \prod_k \, (2 \pi R)^{-1/2} e^{-(y_k - a k)^2/2 R} \tag{inserted eqn. (G1) from T2.} \\\
 &= c \exp\Big(\frac{-1}{2 R}\sum_k (y_k - a k)^2\Big) \\\
 \end{align}
 $$
diff --git a/notebooks/resources/macros.py b/notebooks/resources/macros.py
index b1a6c9d..94fd705 100755
--- a/notebooks/resources/macros.py
+++ b/notebooks/resources/macros.py
@@ -16,7 +16,7 @@ macros=r'''
 \newcommand{\Reals}{\mathbb{R}}
 \newcommand{\Expect}[0]{\mathbb{E}}
-\newcommand{\NormDist}{\mathcal{N}}
+\newcommand{\NormDist}{\mathscr{N}}
 
 \newcommand{\DynMod}[0]{\mathscr{M}}
 \newcommand{\ObsMod}[0]{\mathscr{H}}
diff --git a/notebooks/scripts/T2 - Gaussian distribution.py b/notebooks/scripts/T2 - Gaussian distribution.py
index 8688df7..a4e37c9 100644
--- a/notebooks/scripts/T2 - Gaussian distribution.py
+++ b/notebooks/scripts/T2 - Gaussian distribution.py
@@ -43,14 +43,14 @@
 # ## The univariate (a.k.a. 1-dimensional, scalar) case
-# Consider the Gaussian random variable $x \sim \mathcal{N}(\mu, \sigma^2)$.
+# Consider the Gaussian random variable $x \sim \mathscr{N}(\mu, \sigma^2)$.
 # Its probability density function (**pdf**),
 # $
-# p(x) = \mathcal{N}(x \mid \mu, \sigma^2)
+# p(x) = \mathscr{N}(x \mid \mu, \sigma^2)
 # $ for $x \in (-\infty, +\infty)$,
 # is given by
 # $$\begin{align}
-# \mathcal{N}(x \mid \mu, \sigma^2) = (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,. \tag{G1}
+# \mathscr{N}(x \mid \mu, \sigma^2) = (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,. \tag{G1}
 # \end{align}$$
 #
 # Run the cell below to define a function to compute the pdf (G1) using the `scipy` library.
@@ -89,7 +89,7 @@ def plot_pdf(mu=0, sigma=5):
 # show_answer('pdf_G1')
 # -
-# **Exc -- Derivatives:** Recall $p(x) = \mathcal{N}(x \mid \mu, \sigma^2)$ from eqn (G1).
+# **Exc -- Derivatives:** Recall $p(x) = \mathscr{N}(x \mid \mu, \sigma^2)$ from eqn (G1).
 # Use pen, paper, and calculus to answer the following questions,
 # which derive some helpful mnemonics about the distribution.
 #
@@ -115,7 +115,7 @@ def plot_pdf(mu=0, sigma=5):
 # -
 # #### Exc (optional) -- Integrals
-# Recall $p(x) = \mathcal{N}(x \mid \mu, \sigma^2)$ from eqn (G1). Abbreviate it using $c = (2 \pi \sigma^2)^{-1/2}$.
+# Recall $p(x) = \mathscr{N}(x \mid \mu, \sigma^2)$ from eqn (G1). Abbreviate it using $c = (2 \pi \sigma^2)^{-1/2}$.
 # Use pen, paper, and calculus to show that
 # - (i) the first parameter, $\mu$, indicates its **mean**, i.e. that $$\mu = \Expect[x] \,.$$
 #    *Hint: you can rely on the result of (iii)*
diff --git a/notebooks/scripts/T3 - Bayesian inference.py b/notebooks/scripts/T3 - Bayesian inference.py
index 3bd9206..c712e3f 100644
--- a/notebooks/scripts/T3 - Bayesian inference.py
+++ b/notebooks/scripts/T3 - Bayesian inference.py
@@ -34,7 +34,7 @@
 # studied the Gaussian probability density function (pdf), defined by:
 #
 # $$\begin{align}
-# \mathcal{N}(x \mid \mu, \sigma^2) &= (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,,\tag{G1} \\
+# \mathscr{N}(x \mid \mu, \sigma^2) &= (2 \pi \sigma^2)^{-1/2} e^{-(x-\mu)^2/2 \sigma^2} \,,\tag{G1} \\
 # \NormDist(\x \mid \mathbf{\mu}, \mathbf{\Sigma})
 # &=
 # |2 \pi \mathbf{\Sigma}|^{-1/2} \, \exp\Big(-\frac{1}{2}\|\x-\mathbf{\mu}\|^2_\mathbf{\Sigma} \Big) \,, \tag{GM}
@@ -299,12 +299,12 @@ def Bayes2( corr_R =.6, y1=1, R1=4**2,
 # In response to this computational difficulty, we try to be smart and do something more analytical ("pen-and-paper"): we only compute the parameters (mean and (co)variance) of the posterior pdf.
 #
 # This is doable and quite simple in the Gaussian-Gaussian case, when $\ObsMod$ is linear (i.e. just a number):
-# - Given the prior of $p(x) = \mathcal{N}(x \mid x\supf, P\supf)$
-# - and a likelihood $p(y|x) = \mathcal{N}(y \mid \ObsMod x,R)$,
+# - Given the prior of $p(x) = \mathscr{N}(x \mid x\supf, P\supf)$
+# - and a likelihood $p(y|x) = \mathscr{N}(y \mid \ObsMod x,R)$,
 # - $\implies$ posterior
 # $
 # p(x|y)
-# = \mathcal{N}(x \mid x\supa, P\supa) \,,
+# = \mathscr{N}(x \mid x\supa, P\supa) \,,
 # $
 # where, in the 1-dimensional/univariate/scalar (multivariate is discussed in [T5](T5%20-%20Multivariate%20Kalman%20filter.ipynb)) case:
 #
@@ -324,7 +324,7 @@ def Bayes2( corr_R =.6, y1=1, R1=4**2,
 # - (a) Actually derive the first term of the RHS, i.e. eqns. (5) and (6).
 #    *Hint: you can simplify the task by first "hiding" $\ObsMod$ by astutely multiplying by $1$ somewhere.*
 # - (b) *Optional*: Derive the full RHS (i.e. also the second term).
-# - (c) Derive $p(x|y) = \mathcal{N}(x \mid x\supa, P\supa)$ from eqns. (5) and (6)
+# - (c) Derive $p(x|y) = \mathscr{N}(x \mid x\supa, P\supa)$ from eqns. (5) and (6)
 #    using part (a), Bayes' rule (BR2), and the Gaussian pdf (G1).
 
 # +
@@ -333,11 +333,11 @@ def Bayes2( corr_R =.6, y1=1, R1=4**2,
 # **Exc -- Temperature example:**
 # The statement $x = \mu \pm \sigma$ is *sometimes* used
-# as a shorthand for $p(x) = \mathcal{N}(x \mid \mu, \sigma^2)$. Suppose
+# as a shorthand for $p(x) = \mathscr{N}(x \mid \mu, \sigma^2)$. Suppose
 # - you think the temperature $x = 20°C \pm 2°C$,
 # - a thermometer yields the observation $y = 18°C \pm 2°C$.
 #
-# Show that your posterior is $p(x|y) = \mathcal{N}(x \mid 19, 2)$
+# Show that your posterior is $p(x|y) = \mathscr{N}(x \mid 19, 2)$
 
 # +
 # show_answer('GG BR example')
diff --git a/notebooks/scripts/T4 - Time series filtering.py b/notebooks/scripts/T4 - Time series filtering.py
index 15b4546..bc1707d 100644
--- a/notebooks/scripts/T4 - Time series filtering.py
+++ b/notebooks/scripts/T4 - Time series filtering.py
@@ -153,7 +153,7 @@ def exprmt(seed=4, nTime=50, M=0.97, logR=1, logQ=1, analyses_only=False, logR_b
 # Formulae (5) and (6) are called the **forecast step** of the KF.
 # But when $y_1$ becomes available, according to eqn. (Obs),
 # then we can update/condition our estimate of $x_1$, i.e. compute the posterior,
-# $p(x_1 | y_1) = \mathcal{N}(x_1 \mid x\supa_1, P\supa_1) \,,$
+# $p(x_1 | y_1) = \mathscr{N}(x_1 \mid x\supa_1, P\supa_1) \,,$
 # using the formulae we developed for Bayes' rule with
 # [Gaussian distributions](T3%20-%20Bayesian%20inference.ipynb#Gaussian-Gaussian-Bayes'-rule-(1D)).
 #
diff --git a/notebooks/scripts/T8 - Monte-Carlo & ensembles.py b/notebooks/scripts/T8 - Monte-Carlo & ensembles.py
index 89517de..aa80923 100644
--- a/notebooks/scripts/T8 - Monte-Carlo & ensembles.py
+++ b/notebooks/scripts/T8 - Monte-Carlo & ensembles.py
@@ -77,7 +77,7 @@ def pdf_reconstructions(seed=5, nbins=10, bw=.3):
 #
 # **Exc -- Multivariate Gaussian sampling:**
 # Suppose $\z$ is a standard Gaussian,
-# i.e. $p(\z) = \mathcal{N}(\z \mid \bvec{0},\I_{\xDim})$,
+# i.e. $p(\z) = \mathscr{N}(\z \mid \bvec{0},\I_{\xDim})$,
 # where $\I_{\xDim}$ is the $\xDim$-dimensional identity matrix.
 # Let $\x = \mat{L}\z + \mu$.
 #
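The Gaussian-Gaussian update referenced throughout the T3 and T4 hunks (eqns (5)–(6) and the temperature exercise) is easy to sanity-check numerically. A minimal sketch in the notebooks' language; the function name `gg_update` and the precision (inverse-variance) form of the formulas are illustrative choices, not taken from the notebooks themselves:

```python
def gg_update(xf, Pf, y, R, H=1.0):
    """Scalar Gaussian-Gaussian Bayes update.

    Prior N(xf, Pf) and likelihood N(y | H*x, R) give posterior N(xa, Pa),
    written here in precision (inverse-variance) form.
    """
    Pa = 1.0 / (1.0 / Pf + H**2 / R)  # posterior variance
    xa = Pa * (xf / Pf + H * y / R)   # posterior mean
    return xa, Pa

# Temperature exercise: prior x = 20 ± 2 (variance 4), obs y = 18 ± 2 (variance 4).
xa, Pa = gg_update(xf=20.0, Pf=4.0, y=18.0, R=4.0)
print(xa, Pa)  # 19.0 2.0, i.e. the stated posterior N(x | 19, 2)
```

Equal prior and observation variances make the posterior mean the midpoint of the two estimates, with the variance halved.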
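Likewise for the T8 sampling exercise: the point of $\x = \mat{L}\z + \mu$ is that any $\mat{L}$ satisfying $\mat{L}\mat{L}^T = \mathbf{\Sigma}$ (e.g. a Cholesky factor) maps standard-normal draws to draws from $\mathscr{N}(\mu, \mathbf{\Sigma})$. A sketch with illustrative numbers (the particular `mu` and `Sigma` are made up, not from the notebook):

```python
import numpy as np

rng = np.random.default_rng(42)

mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

L = np.linalg.cholesky(Sigma)          # any L with L @ L.T == Sigma would do
z = rng.standard_normal((2, 100_000))  # columns are independent draws of z ~ N(0, I)
x = L @ z + mu[:, None]                # x = L z + mu  =>  x ~ N(mu, Sigma)

print(x.mean(axis=1))  # close to mu
print(np.cov(x))       # close to Sigma
```

The sample mean and sample covariance recover `mu` and `Sigma` up to Monte-Carlo error, which shrinks as the ensemble grows.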