diff --git a/docs/site/_docs/api.md b/docs/site/_docs/api.md
index 80f3a5116..7abd118f9 100644
--- a/docs/site/_docs/api.md
+++ b/docs/site/_docs/api.md
@@ -53,7 +53,7 @@ end
```
-source
+source
### # **`Turing.@~`** — *Macro*.
@@ -71,7 +71,7 @@ Example:
```
-source
+source
@@ -93,7 +93,7 @@ Generic interface for implementing inference algorithms. An implementation of an
Turing translates models to chunks that call the modelling functions at specified points. The dispatch is based on the value of a `sampler` variable. To include a new inference algorithm, implement the requirements mentioned above in a separate file, then include that file at the end of this one.
-source
+source
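+
+To make the dispatch concrete, here is a minimal sketch of the pattern described above. All names (`MyAlg`, `MySampler`, `step`) are hypothetical illustrations, not Turing's actual internal interface:
+
+```julia
+# Hypothetical sketch: dispatch on the sampler's algorithm type.
+struct MyAlg
+    n_iters::Int
+end
+
+# A sampler wraps an algorithm; code dispatches on the type parameter.
+struct MySampler{A}
+    alg::A
+end
+
+# Each algorithm supplies its own method for the sampling entry point.
+function step(model, spl::MySampler{MyAlg})
+    # algorithm-specific transition logic would go here
+    return spl.alg.n_iters
+end
+
+step(nothing, MySampler(MyAlg(100)))  # selects the MyAlg method
+```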
### # **`Turing.Gibbs`** — *Type*.
@@ -111,7 +111,7 @@ alg = Gibbs(1000, HMC(1, 0.2, 3, :v1), PG(20, 1, :v2))
```
-source
+source
### # **`Turing.HMC`** — *Type*.
@@ -144,7 +144,7 @@ sample(gdemo([1.5, 2]), HMC(1000, 0.05, 10))
```
-source
+source
### # **`Turing.HMCDA`** — *Type*.
@@ -177,7 +177,7 @@ sample(gdemo([1.5, 2]), HMCDA(1000, 200, 0.65, 0.3))
```
-source
+source
### # **`Turing.IPMCMC`** — *Type*.
@@ -210,7 +210,7 @@ sample(gdemo([1.5, 2]), IPMCMC(100, 100, 4, 2))
```
-source
+source
### # **`Turing.IS`** — *Type*.
@@ -245,7 +245,7 @@ sample(gdemo([1.5, 2]), IS(1000))
```
-source
+source
### # **`Turing.MH`** — *Type*.
@@ -278,7 +278,7 @@ sample(gdemo([1.5, 2]), MH(1000, (:m, (x) -> Normal(x, 0.1)), :s))
```
-source
+source
### # **`Turing.NUTS`** — *Type*.
@@ -311,7 +311,7 @@ sample(gdemo([1.5, 2]), NUTS(1000, 200, 0.65))
```
-source
+source
### # **`Turing.PG`** — *Type*.
@@ -344,7 +344,7 @@ sample(gdemo([1.5, 2]), PG(100, 100))
```
-source
+source
### # **`Turing.PMMH`** — *Type*.
@@ -363,7 +363,7 @@ alg = PMMH(100, SMC(20, :v1), MH(1, (:v2, (x) -> Normal(x, 1))))
```
-source
+source
### # **`Turing.SGHMC`** — *Type*.
@@ -391,7 +391,7 @@ sample(example, SGHMC(1000, 0.01, 0.1))
```
-source
+source
### # **`Turing.SGLD`** — *Type*.
@@ -419,7 +419,7 @@ sample(example, SGLD(1000, 0.5))
```
-source
+source
### # **`Turing.SMC`** — *Type*.
@@ -452,13 +452,69 @@ sample(gdemo([1.5, 2]), SMC(1000))
```
-source
+source
+
+
+
+
+## Data Structures
+
+### # **`Libtask.TArray`** — *Type*.
+
+
+```
+TArray{T}(dims, ...)
+```
+
+Implementation of data structures that automatically perform copy-on-write after task copying.
+
+If `current_task` is an existing key in `s`, then return `s[current_task]`. Otherwise, return `s[current_task] = s[last_task]`.
+
+Usage:
+
+```julia
+TArray(dim)
+```
+
+Example:
+
+```julia
+ta = TArray(4) # init
+for i in 1:4 ta[i] = i end # assign
+Array(ta) # convert to 4-element Array{Int64,1}: [1, 2, 3, 4]
+```
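+
+The copy-on-write behaviour shows up when a task is copied. Below is a minimal sketch of that interaction; it assumes Libtask's `CTask`, `produce`, `consume`, and task `copy`, whose exact form has varied across Libtask versions:
+
+```julia
+using Libtask
+
+t = CTask() do
+    ta = tzeros(1)       # a one-element TArray of zeros
+    for i in 1:5
+        ta[1] = i        # write, then hand the value out
+        produce(ta[1])
+    end
+end
+
+consume(t)     # 1
+t2 = copy(t)   # copy the task; the TArray is copied lazily (copy-on-write)
+consume(t)     # 2 -- further progress in `t` ...
+consume(t2)    # 2 -- ... is not shared with the copy `t2`
+```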
+
+
+
+
+## Utilities
+
+### # **`Libtask.tzeros`** — *Function*.
+
+
+```
+tzeros(dims, ...)
+```
+
+Construct a `TArray` of zeros. Trailing arguments are the same as those accepted by `TArray`.
+
+```julia
+tzeros(dim)
+```
+
+Example:
+
+```julia
+tz = tzeros(4) # construct
+Array(tz) # convert to 4-element Array{Int64,1}: [0, 0, 0, 0]
+```
+
## Index
+- [`Libtask.TArray`]({{site.baseurl}}/docs/library/#Libtask.TArray)
- [`Turing.Gibbs`]({{site.baseurl}}/docs/library/#Turing.Gibbs)
- [`Turing.HMC`]({{site.baseurl}}/docs/library/#Turing.HMC)
- [`Turing.HMCDA`]({{site.baseurl}}/docs/library/#Turing.HMCDA)
@@ -472,6 +528,7 @@ sample(gdemo([1.5, 2]), SMC(1000))
- [`Turing.SGLD`]({{site.baseurl}}/docs/library/#Turing.SGLD)
- [`Turing.SMC`]({{site.baseurl}}/docs/library/#Turing.SMC)
- [`Turing.Sampler`]({{site.baseurl}}/docs/library/#Turing.Sampler)
+- [`Libtask.tzeros`]({{site.baseurl}}/docs/library/#Libtask.tzeros)
- [`Turing.@model`]({{site.baseurl}}/docs/library/#Turing.@model)
- [`Turing.@~`]({{site.baseurl}}/docs/library/#Turing.@~)
diff --git a/docs/site/_docs/quick-start.md b/docs/site/_docs/quick-start.md
deleted file mode 100644
index 23285f48d..000000000
--- a/docs/site/_docs/quick-start.md
+++ /dev/null
@@ -1,57 +0,0 @@
----
-title: Probabilistic Programming in Thirty Seconds
-permalink: /docs/quick-start/
----
-
-If you are already well-versed in probabilistic programming and just want a quick look at Turing's syntax, or simply want a model to start with, we have provided a Bayesian coin-flipping model to play with.
-
-
-This example can be run wherever you have Julia installed (see [Getting Started](get-started.md)), but you will need to install the packages `Turing`, `Distributions`, `MCMCChain`, and `StatPlots` if you have not done so already.
-
-
-This is an excerpt from a more formal example introducing probabilistic programming, which can be found in Jupyter notebook form [here](https://github.com/TuringLang/TuringTutorials/blob/master/0_Introduction.ipynb) or as part of the documentation website [here](0_Introduction.md).
-
-
-```julia
-# Import libraries.
-using Turing, Distributions, MCMCChain, StatPlots, Random
-
-# Set the true probability of heads in a coin.
-p_true = 0.5
-
-# Iterate from having seen 0 observations to 100 observations.
-Ns = 0:100;
-
-# Draw data from a Bernoulli distribution, i.e. draw heads or tails.
-Random.seed!(12)
-data = rand(Bernoulli(p_true), last(Ns))
-
-# Here's what the first five coin flips look like:
-data[1:5]
-
-# Declare our Turing model.
-@model coinflip(y) = begin
- # Our prior belief about the probability of heads in a coin.
- p ~ Beta(1, 1)
-
- # The number of observations.
- N = length(y)
- for n in 1:N
- # Heads or tails of a coin are drawn from a Bernoulli distribution.
- y[n] ~ Bernoulli(p)
- end
-end;
-
-# Settings of the Hamiltonian Monte Carlo (HMC) sampler.
-iterations = 1000
-ϵ = 0.05
-τ = 10
-
-# Start sampling.
-chain = sample(coinflip(data), HMC(iterations, ϵ, τ));
-
-# Construct a summary of the sampling process for the parameter p, i.e. the probability of heads in a coin.
-p_summary = Chains(chain[:p])
-histogram(p_summary)
-```
-
diff --git a/docs/site/_includes/feature_row b/docs/site/_includes/feature_row
index 89dfc1b47..46037c8b8 100644
--- a/docs/site/_includes/feature_row
+++ b/docs/site/_includes/feature_row
@@ -31,6 +31,17 @@
{% endif %}
+ {% if f.i_class %}
+
+