
Commit

Migrate to GitHub Actions (#199)
jacobtomlinson authored Nov 9, 2020
1 parent 2b9db92 commit a475118
Showing 3 changed files with 50 additions and 54 deletions.
38 changes: 38 additions & 0 deletions .github/workflows/ci-build.yaml
@@ -0,0 +1,38 @@
name: CI
on: [push, pull_request]

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout source
        uses: actions/checkout@v2

      - name: Setup Conda Environment
        uses: goanpeca/setup-miniconda@v1
        with:
          miniconda-version: "latest"
          python-version: "3.7"
          environment-file: binder/environment.yml
          activate-environment: dask-tutorial
          auto-activate-base: false

      - name: Install testing and docs dependencies
        shell: bash -l {0}
        run: |
          conda install -c conda-forge nbconvert nbformat jupyter_client ipykernel
          pip install nbsphinx dask-sphinx-theme sphinx

      - name: Build
        shell: bash -l {0}
        run: |
          python prep.py --small
          sphinx-build -M html . _build -v

      - name: Deploy
        if: ${{ github.ref == 'refs/heads/master' && github.event_name != 'pull_request' }}
        uses: JamesIves/[email protected]
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          BRANCH: gh-pages
          FOLDER: _build/html
          CLEAN: true
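The Deploy step only publishes the built site for pushes to master: its `if` expression checks both the ref and the event name. A minimal Python sketch of that gate (the `should_deploy` helper is hypothetical, purely to illustrate the boolean logic):

```python
def should_deploy(ref: str, event_name: str) -> bool:
    # Mirrors the Actions expression:
    #   github.ref == 'refs/heads/master' && github.event_name != 'pull_request'
    return ref == "refs/heads/master" and event_name != "pull_request"

print(should_deploy("refs/heads/master", "push"))          # push to master deploys
print(should_deploy("refs/heads/feature", "push"))         # other branches are skipped
print(should_deploy("refs/heads/master", "pull_request"))  # PR builds never deploy
```

The event-name check matters because `on: [push, pull_request]` triggers both event types, and a PR targeting master should build but not publish.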
43 changes: 0 additions & 43 deletions .travis.yml

This file was deleted.

23 changes: 12 additions & 11 deletions README.md
@@ -4,6 +4,7 @@ This tutorial was last given at SciPy 2020 which was a virtual conference.
[A video of the SciPy 2020 tutorial is available online](https://www.youtube.com/watch?v=EybGGLbLipI).

[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/dask/dask-tutorial/master?urlpath=lab)
[![Build Status](https://github.com/dask/dask-tutorial/workflows/CI/badge.svg)](https://github.com/dask/dask-tutorial/actions?query=workflow%3ACI)

Dask provides multi-core execution on larger-than-memory datasets.

@@ -35,13 +36,13 @@ schedulers (odd sections.)

and then install necessary packages.
There are three different ways to achieve this; pick the one that best suits you, and ***only pick one option***.
They are, in order of preference:

#### 2a) Create a conda environment (preferred)

In the main repo directory

conda env create -f binder/environment.yml
conda activate dask-tutorial
jupyter labextension install @jupyter-widgets/jupyterlab-manager
jupyter labextension install @bokeh/jupyter_bokeh
@@ -55,10 +56,10 @@ You will need the following core libraries
You may find the following libraries helpful for some exercises

conda install python-graphviz -c conda-forge

Note that this option will alter your existing environment, potentially changing the versions of packages you already
have installed.

#### 2c) Use Dockerfile

You can build a Docker image out of the provided Dockerfile.
@@ -69,7 +70,7 @@ Run a container, replacing the ID with the output of the previous command

$ docker run -it -p 8888:8888 -p 8787:8787 <container_id_or_tag>

The above command will give a URL (like `http://(container_id or 127.0.0.1):8888/?token=<sometoken>`) which
can be used to access the notebook from your browser. You may need to replace the given hostname with "localhost" or
"127.0.0.1".

@@ -79,7 +80,7 @@

From the repo directory

jupyter notebook

Or

@@ -110,8 +111,8 @@ This was already done for method c) and does not need repeating.

2. [Bag](02_bag.ipynb) - the first high-level collection: a generalized iterator for use
with a functional programming style and to clean messy data.

3. [Array](03_array.ipynb) - blocked numpy-like functionality with a collection of
numpy arrays spread across your cluster.

4. [Dataframe](04_dataframe.ipynb) - parallelized operations on many pandas dataframes
@@ -120,7 +121,7 @@ spread across your cluster.
5. [Distributed](05_distributed.ipynb) - Dask's scheduler for clusters, with details of
how to view the UI.

6. [Advanced Distributed](06_distributed_advanced.ipynb) - further details on distributed
computing, including how to debug.

7. [Dataframe Storage](07_dataframe_storage.ipynb) - efficient ways to read and write
