Merge pull request #468 from punchagan/readme-updates
Updates to the README and minor local workflow improvements
shakthimaan authored Feb 2, 2024
2 parents 13ab71d + 23de8f3 commit 5b97f1f
Showing 4 changed files with 65 additions and 93 deletions.
11 changes: 4 additions & 7 deletions .github/workflows/main.yml
@@ -36,8 +36,7 @@ jobs:

- name: Install dependencies
run: |
-          sudo apt-get update && sudo apt-get -y install wget pkg-config libgmp-dev m4 libdw-dev jq python3-pip autoconf jo libgmp-dev libopenblas-dev liblapacke-dev zlib1g-dev
-          pip3 install intervaltree
+          sudo apt-get update && make install-depends
# Runs a set of commands using the runners shell
- name: 5.3.0+trunk+serial
@@ -93,8 +92,7 @@ jobs:

- name: Install dependencies
run: |
-          sudo apt-get update && sudo apt-get -y install wget pkg-config libgmp-dev m4 libdw-dev jq python3-pip autoconf jo libgmp-dev libopenblas-dev liblapacke-dev zlib1g-dev
-          pip3 install intervaltree
+          sudo apt-get update && make install-depends
# Runs a set of commands using the runners shell
- name: 5.2.0+trunk+serial
@@ -147,8 +145,7 @@ jobs:

- name: Install dependencies
run: |
-          sudo apt-get update && sudo apt-get -y install wget pkg-config libgmp-dev m4 libdw-dev jq python3-pip autoconf jo libgmp-dev libopenblas-dev liblapacke-dev zlib1g-dev
-          pip3 install intervaltree
+          sudo apt-get update && make install-depends
- name: 4.14.0+serial
run: |
@@ -187,7 +184,7 @@ jobs:

- name: test_notebooks
run: |
-          sudo apt-get update && sudo apt-get -y install wget pkg-config libgmp-dev m4 libdw-dev jq python3-pip jo libgmp-dev
+          sudo apt-get update && make install-depends
python3 -m pip install markupsafe==2.0.1
export PATH=$PATH:/home/opam/.local/bin
pip3 install jupyter nbconvert seaborn==0.11.2 pandas==1.5.3 numpy==1.23.5
10 changes: 6 additions & 4 deletions CONTRIBUTING.md
@@ -3,10 +3,12 @@
You can add new benchmarks as follows:

- **Add dependencies to packages:** If there are any package dependencies your
-  benchmark has that are not already included in Sandmark, add its opam file to
-  `dependencies/packages/<package-name>/<package-version>/opam`. If the package
-  depends on other packages, repeat this step for all of those packages. Add
-  the package to `PACKAGES` variable in the Makefile.
+  benchmark has that are not already included in Sandmark, you can add it as a
+  dependency to `dependencies/template/dev-*.opam`. If you need to apply any
+  patches to the dependency, add its opam file to
+  `dependencies/packages/<package-name>/<package-version>/opam`, and the patch
+  to the `dependencies/packages/<package-name>/<package-version>/files/`
+  directory.
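To make the layout above concrete, a pinned dependency's opam file under `dependencies/packages/` might look something like the following minimal sketch. The package name, maintainer, version, URL, and checksum are all hypothetical placeholders, not entries from the repository:

```
opam-version: "2.0"
synopsis: "Hypothetical benchmark dependency"
maintainer: "you@example.com"
depends: [
  "dune" {>= "3.0"}
]
build: [
  ["dune" "build" "-p" name "-j" jobs]
]
url {
  src: "https://example.com/mypkg/archive/1.0.tar.gz"
  checksum: "sha256=<hex digest of the tarball>"
}
```

A patch placed in the matching `files/` directory would then be listed in the opam file's `patches:` field.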

- **Add benchmark files:** Find a relevant folder in `benchmarks/` and add your
code to it. Feel free to create a new folder if you don't find any existing
5 changes: 4 additions & 1 deletion Makefile
@@ -77,7 +77,7 @@ else
endif

DEPENDENCIES = libgmp-dev libdw-dev libopenblas-dev liblapacke-dev zlib1g-dev jq jo python3-pip pkg-config m4 autoconf # Ubuntu
-PIP_DEPENDENCIES = intervaltree
+PIP_DEPENDENCIES =

.SECONDARY:
export OPAMROOT=$(CURDIR)/_opam
@@ -380,6 +380,9 @@ depend: check_url load_check
$(foreach d, $(DEPENDENCIES), $(call check_dependency, $(d), dpkg -l, Install on Ubuntu using apt.))
$(foreach d, $(PIP_DEPENDENCIES), $(call check_dependency, $(d), pip3 list --format=columns, Install using pip3 install.))

+install-depends:
+	sudo apt-get install --no-install-recommends --assume-yes $(DEPENDENCIES)

check-parallel/%:
$(eval CONFIG_SWITCH_NAME = $*)
@{ if [[ $(BUILD_BENCH_TARGET) =~ multibench* && $(CONFIG_SWITCH_NAME) =~ 4.14* ]]; then \
132 changes: 51 additions & 81 deletions README.md
@@ -7,7 +7,7 @@ different compiler variants, run and visualise the results.

Sandmark includes both sequential and parallel benchmarks. The results from the
nightly benchmark runs are available at
-[sandmark.ocamllabs.io](https://sandmark.ocamllabs.io).
+[sandmark.tarides.com](https://sandmark.tarides.com).

## 📣 Attention Users 🫵

@@ -17,47 +17,62 @@ config](https://github.com/ocaml-bench/sandmark-nightly-config#adding-your-compi
on if you are interested in setting up your own instance of Sandmark for local
runs.

-## Quick Start
+# FAQ

-On Ubuntu 18.04.4 LTS you can try the following commands:
+## How do I run the benchmarks locally?
+
+On Ubuntu 20.04.4 LTS or newer, you can run the following commands:

```bash
-$ sudo apt-get install curl git libgmp-dev libdw-dev python3-pip jq jo bubblewrap \
-    pkg-config m4 unzip
-$ pip3 install jupyter seaborn pandas intervaltree
+# Clone the repository
+$ git clone https://github.com/ocaml-bench/sandmark.git && cd sandmark
+
+# Install dependencies
+$ make install-depends
+
+# Install OPAM if not available already
+$ sh <(curl -sL https://raw.githubusercontent.com/ocaml/opam/master/shell/install.sh)
+$ opam init

-$ git clone https://github.com/ocaml-bench/sandmark.git
-$ cd sandmark
+## You can run all the serial or parallel benchmarks using the respective run_all_*.sh scripts
+## You can edit the scripts to change the ocaml-version for which to run the benchmarks
+
+$ bash run_all_serial.sh    # Run all serial benchmarks
+$ bash run_all_parallel.sh  # Run all parallel benchmarks
+```

-## For 4.14.0+domains
+You can now find the results in the `_results/` folder.

-$ make ocaml-versions/4.14.0+domains.bench
+## How do I add new benchmarks?

-## For 5.1.0+trunk
+See [CONTRIBUTING.md](./CONTRIBUTING.md)

-$ opam pin add -n --yes dune 3.5.0
-$ opam install dune
+## How do I visualize the benchmark results?

-$ TAG='"run_in_ci"' make run_config_filtered.json
-$ USE_SYS_DUNE_HACK=1 RUN_CONFIG_JSON=run_config_filtered.json make ocaml-versions/5.1.0+trunk.bench
-```
+### Local runs

-You can now find the results in the `_results/` folder.
+1. To visualize the local results, there are a handful of IPython notebooks
+   available in [notebooks/](./notebooks/), which are maintained on a
+   best-effort basis. See the [README](./notebooks/README.md) for more
+   information on how to use them.

-## Pre-requisites
+2. You can run
+   [sandmark-nightly](https://github.com/ocaml-bench/sandmark-nightly?tab=readme-ov-file#how-to-run-the-webapp-locally)
+   locally and visualize the local results directory using the local Sandmark
+   nightly app.
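For quick local analysis outside the notebooks, the results files can also be read with a few lines of Python. Sandmark results are stored as files containing one JSON record per line; the records and field names in this sketch are illustrative placeholders, not the exact orun schema:

```python
import json

# Illustrative stand-in for the contents of a _results/*.bench file:
# one JSON record per benchmark run (field names are placeholders).
sample = """\
{"name": "binarytrees", "time_secs": 1.23, "gc": {"major_collections": 4}}
{"name": "fft", "time_secs": 0.98, "gc": {"major_collections": 2}}
"""

def load_bench(text):
    """Parse newline-delimited JSON benchmark records into a list of dicts."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

for record in load_bench(sample):
    print(record["name"], record["time_secs"])
```

The same loop works on a real results file by replacing `sample` with the file's contents.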

-On GNU/Linux you need to have `libgmp-dev` installed for several of
-the benchmarks to work. You also need to have `libdw-dev` installed
-for the profiling functionality of orun to work on Linux.
+### Nightly production runs

-You can run `make depend` that will check for any missing
-dependencies.
+Sandmark benchmarks are configured to run nightly on [navajo](./nightly_navajo.sh) and
+[turing](./nightly_turing.sh). The results for these benchmark runs are available at
+[sandmark.tarides.com](https://sandmark.tarides.com).

-## Overview
+## How are the machines tuned for benchmarking?
+
+You can find detailed notes on the OS settings for the benchmarking servers
+[here](https://github.com/ocaml-bench/ocaml_bench_scripts/?tab=readme-ov-file#notes-on-hardware-and-os-settings-for-linux-benchmarking).
+
+# Overview

Sandmark uses opam, with a static local repository, to build external
libraries and applications. It then builds any sandmark OCaml
@@ -66,11 +81,11 @@ benchmarks as defined in the `run_config.json`

These stages are implemented in:

-- Opam setup: the `Makefile` handles the creation of an opam switch
-  that builds a custom compiler as specified in the
-  `ocaml-versions/<version>.var` file. It then installs all the
-  required packages; these packages are statically defined by their
-  opam files in the `dependencies` directory.
+- Opam setup: the `Makefile` handles the creation of an opam switch that
+  builds a custom compiler as specified in the `ocaml-versions/<version>.json`
+  file. It then installs all the required packages; the package versions are
+  defined in `dependencies/template/*.opam` files. The dependencies can be
+  patched or tweaked using the `dependencies` directory.
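As a sketch, an `ocaml-versions/<version>.json` file has roughly this shape. The key names `configure` and `run_param` are the ones the README's variable table refers to, but the URL and values here are illustrative, not copied from the repository:

```json
{
  "url": "https://github.com/ocaml/ocaml/archive/trunk.tar.gz",
  "configure": "--enable-flambda",
  "run_param": "v=0x400"
}
```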

- Runplan: the list of benchmarks which will run along with the
measurement wrapper (e.g. orun or perf) is specified in
@@ -85,6 +100,7 @@ These stages are implemented in:
`run_config.json` and specified via the `RUN_BENCH_TARGET` variable
passed to the makefile.


## Configuration of the compiler build

The compiler variant and its configuration options can be specified in
@@ -115,11 +131,12 @@ The various options are described below:

### orun

-The orun wrapper is packaged in `orun/`, it collects runtime and OCaml
-garbage collector statistics producing output in a JSON format. You
-can use orun independently of the sandmark benchmarking suite, by
-installing it as an opam pin (e.g. `opam install .` from within
-`orun/`).
+The orun wrapper is packaged as a separate package
+[here](https://opam.ocaml.org/packages/orun/). It collects runtime and OCaml
+garbage collector statistics, producing output in JSON format.
+
+You can use orun independently of the sandmark benchmarking suite by
+installing it, e.g. using `opam install orun`.
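Once installed, orun wraps a benchmark executable and writes its statistics to a file. This invocation is a sketch from memory; the `-o` flag, file name, and program are illustrative, so check `orun --help` for the exact interface:

```bash
$ orun -o alloc.orun.bench -- ./alloc.exe 400_000
$ jq . alloc.orun.bench   # pretty-print the collected JSON record
```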

### Using a directory different than /home

@@ -269,10 +286,6 @@ repo](https://github.com/ocaml-bench/sandmark-nightly/commits/main), so that
they can be visualized using the [sandmark nightly
UI](https://sandmark.tarides.com/)

-### Adding benchmarks
-
-See [CONTRIBUTING.md](./CONTRIBUTING.md)

### Config files

The `*_config.json` files used to build benchmarks
@@ -291,48 +304,6 @@ The following table marks the benchmarks that are currently not working with any
| 5.0.0+trunk.bench | irmin benchmarks | [sandmark#262](https://github.com/ocaml-bench/sandmark/issues/262) |
| 4.14.0+domains.bench | irmin benchmarks | [sandmark#262](https://github.com/ocaml-bench/sandmark/issues/262) |

-## UI
-
-JupyterHub is a multi-user server for hosting Jupyter notebooks. The
-Littlest JupyterHub (TLJH) installation is capable of hosting 0-100
-users.
-
-The following steps can be used for installation on Ubuntu 18.04.4 LTS:
-
-```bash
-$ sudo apt install python3 python3-dev git curl
-$ curl https://raw.githubusercontent.com/jupyterhub/the-littlest-jupyterhub/master/bootstrap/bootstrap.py | \
-  sudo -E python3 - --admin adminuser
-```
-
-If you would like to run the the service on a specific port, say
-"8082", you need to update the same in /opt/tljh/state/traefix.toml
-file.
-
-You can verify that the services are running from:
-
-```bash
-$ sudo systemctl status traefik
-$ sudo systemctl status jupyterhub
-```
-
-By default, the hub login opens at hostname:15001/hub/login, which is
-used by the admin user to create user accounts. The users will be able
-to login using hostname:8082/user/username/tree.
-
-You can also setup HTTPS using Let's Encrypt with JuptyerHub using the
-following steps:
-
-```bash
-$ sudo tljh-config set https.enabled true
-$ sudo tljh-config set https.letsencrypt.email e-mail
-$ sudo tljh-config add-item https.letsencrypt.domains example.domain
-$ sudo tljh-config show
-$ sudo tljh-config reload proxy
-```
-
-Reference: https://tljh.jupyter.org/en/latest/install/custom-server.html

## Multicore Notes

### ctypes
@@ -366,7 +337,6 @@ work on OS X is to install GNU sed with homebrew and then update the
| OCAML_CONFIG_OPTION | Function that gets the runtime parameters `configure` in `ocaml-versions/*.json` | null string | building compiler and its dependencies |
| OCAML_RUN_PARAM | Function that gets the runtime parameters `run_param` in `ocaml-versions/*.json` | null string | building compiler and its dependencies |
| PACKAGES | List of all the benchmark dependencies in sandmark | ```cpdf conf-pkg-config conf-zlib bigstringaf decompress camlzip menhirLib menhir minilight base stdio dune-private-libs dune-configurator camlimages yojson lwt zarith integers uuidm react ocplib-endian nbcodec checkseum sexplib0 eventlog-tools irmin cubicle conf-findutils index logs mtime ppx_deriving ppx_deriving_yojson ppx_irmin repr ppx_repr irmin-layers irmin-pack ``` | building benchmark |
-| PIP_DEPENDENCIES | List of Python dependencies | ```intervaltree``` | building compiler and its dependencies |
| PRE_BENCH_EXEC | Any specific commands that needed to be executed before the benchmark. For eg. `PRE_BENCH_EXEC='taskset --cpu-list 3 setarch uname -m --addr-no-randomize'` | null string | executing benchmark |
| RUN_BENCH_TARGET | The executable to be used to run the benchmarks | `run_orun` | executing benchmark |
| RUN_CONFIG_JSON | Input file selection that contains the list of benchmarks | `run_config.json` | executing benchmark |
