Commit

Update notebook
tomvothecoder committed Apr 16, 2024
1 parent 22699bb commit 42db452
Showing 1 changed file with 4 additions and 4 deletions.
8 changes: 4 additions & 4 deletions docs/examples/parallel-computing-with-dask.ipynb
@@ -105,11 +105,11 @@
" - Data is loaded into memory and **computation** is performed in **streaming fashion**, **block-by-block**\n",
"- Computation is controlled by multi-processing or thread pool\n",
"\n",
"&mdash; <cite>https://docs.xarray.dev/en/stable/user-guide/dask.html</cite>\n",
"\n",
"<div style=\"text-align:center\">\n",
" <img src=\"../_static/dask-array.png\" alt=\"Dask Array\" style=\"display: inline-block; width:300px;\">\n",
"</div>\n",
"\n",
"&mdash; <cite>https://docs.xarray.dev/en/stable/user-guide/dask.html</cite>\n"
"</div>\n"
]
},
{
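The cell changed above describes Dask's chunked, block-by-block execution model, where data is split into blocks and computation streams through them. A minimal sketch of that model (assuming `dask` is installed; the array shape and chunk sizes here are arbitrary illustrations, not values from the notebook):

```python
import dask.array as da

# A 4000x4000 array split into 1000x1000 blocks. Nothing is computed
# yet -- Dask only records a task graph over the blocks.
x = da.random.random((4000, 4000), chunks=(1000, 1000))

# Operations stay lazy and extend the task graph.
y = (x + x.T).mean(axis=0)

# .compute() walks the graph block-by-block, by default using a
# thread pool, and returns a concrete NumPy array.
result = y.compute()
print(result.shape)  # (4000,)
```

Until `.compute()` is called, `y` holds only the recipe for the result, which is what lets Dask process arrays larger than memory.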
@@ -275,7 +275,7 @@
"source": [
"## Using a Dask Cluster for Scalable Computations\n",
"\n",
"- All of the large-scale Dask collections like Dask Array, Dask DataFrame, and Dask Bag and the fine-grained APIs like delayed and futures generate task graphs where each node in the graph is a normal Python function and edges between nodes are normal Python objects that are created by one task as outputs and used as inputs in another task.\n",
"- All of the large-scale Dask collections like Dask Array, Dask DataFrame, and Dask Bag and the fine-grained APIs like delayed and futures **generate task graphs** where each node in the graph is a normal Python function and edges between nodes are normal Python objects that are created by one task as outputs and used as inputs in another task.\n",
"\n",
"- After Dask generates these task graphs, it needs to execute them on parallel hardware. This is the job of a **task scheduler**.\n",
"\n",
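The edited cell describes how Dask collections and the fine-grained APIs build task graphs that a scheduler then executes. A small illustration with `dask.delayed` (assuming `dask` is installed; `inc` and `add` are made-up toy functions, not part of the notebook):

```python
import dask


@dask.delayed
def inc(i):
    # Each call becomes a node in the task graph: a plain Python function.
    return i + 1


@dask.delayed
def add(a, b):
    return a + b


# Nothing runs yet -- these calls only build the graph. The edges are
# the outputs of inc(...) flowing in as inputs to add(...).
total = add(inc(1), inc(2))

# A task scheduler executes the graph; here the local threaded scheduler.
print(total.compute(scheduler="threads"))  # 5
```

Swapping `scheduler="threads"` for a `distributed.Client` runs the same graph on a Dask cluster, which is the point the notebook section makes.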
