From 4fb9d0ecad6671d39c6b98bf3bf1332e2cee4510 Mon Sep 17 00:00:00 2001
From: Puneeth Chaganti
Date: Fri, 2 Feb 2024 13:01:25 +0530
Subject: [PATCH 1/7] Update URL to sandmark.tarides.com

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index bb26ec6d9..2ab8e5deb 100644
--- a/README.md
+++ b/README.md
@@ -7,7 +7,7 @@
 different compiler variants, run and visualise the results. Sandmark
 includes both sequential and parallel benchmarks. The results from the
 nightly benchmark runs are available at
-[sandmark.ocamllabs.io](https://sandmark.ocamllabs.io).
+[sandmark.tarides.com](https://sandmark.tarides.com).
 
 ## 📣 Attention Users 🫵

From 2418e01ef57c2a6b8bc626a6d4ae5a804929ccb4 Mon Sep 17 00:00:00 2001
From: Puneeth Chaganti
Date: Fri, 2 Feb 2024 13:10:32 +0530
Subject: [PATCH 2/7] Remove the dependency on intervaltree for the benchmark
 runs

---
 .github/workflows/main.yml | 3 ---
 Makefile                   | 2 +-
 README.md                  | 2 --
 3 files changed, 1 insertion(+), 6 deletions(-)

diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml
index 07ddec8be..07e3dc132 100644
--- a/.github/workflows/main.yml
+++ b/.github/workflows/main.yml
@@ -37,7 +37,6 @@ jobs:
       - name: Install dependencies
         run: |
           sudo apt-get update && sudo apt-get -y install wget pkg-config libgmp-dev m4 libdw-dev jq python3-pip autoconf jo libgmp-dev libopenblas-dev liblapacke-dev zlib1g-dev
-          pip3 install intervaltree
 
       # Runs a set of commands using the runners shell
       - name: 5.3.0+trunk+serial
@@ -94,7 +93,6 @@
       - name: Install dependencies
         run: |
           sudo apt-get update && sudo apt-get -y install wget pkg-config libgmp-dev m4 libdw-dev jq python3-pip autoconf jo libgmp-dev libopenblas-dev liblapacke-dev zlib1g-dev
-          pip3 install intervaltree
 
       # Runs a set of commands using the runners shell
       - name: 5.2.0+trunk+serial
@@ -148,7 +146,6 @@
       - name: Install dependencies
         run: |
           sudo apt-get update && sudo apt-get -y install wget pkg-config libgmp-dev m4 libdw-dev jq python3-pip autoconf jo libgmp-dev libopenblas-dev liblapacke-dev zlib1g-dev
-          pip3 install intervaltree
 
       - name: 4.14.0+serial
         run: |
diff --git a/Makefile b/Makefile
index a264d46b6..e4b9c2724 100644
--- a/Makefile
+++ b/Makefile
@@ -77,7 +77,7 @@ else
 endif
 
 DEPENDENCIES = libgmp-dev libdw-dev libopenblas-dev liblapacke-dev zlib1g-dev jq jo python3-pip pkg-config m4 autoconf # Ubuntu
-PIP_DEPENDENCIES = intervaltree
+PIP_DEPENDENCIES =
 
 .SECONDARY:
 export OPAMROOT=$(CURDIR)/_opam
diff --git a/README.md b/README.md
index 2ab8e5deb..9c3152e4e 100644
--- a/README.md
+++ b/README.md
@@ -24,7 +24,6 @@ On Ubuntu 18.04.4 LTS you can try the following commands:
 ```bash
 $ sudo apt-get install curl git libgmp-dev libdw-dev python3-pip jq jo bubblewrap \
     pkg-config m4 unzip
-$ pip3 install jupyter seaborn pandas intervaltree
 
 # Install OPAM if not available already
 $ sh <(curl -sL https://raw.githubusercontent.com/ocaml/opam/master/shell/install.sh)
@@ -366,7 +365,6 @@ work on OS X is to install GNU sed with homebrew and then update the
 | OCAML_CONFIG_OPTION | Function that gets the runtime parameters `configure` in `ocaml-versions/*.json` | null string | building compiler and its dependencies |
 | OCAML_RUN_PARAM | Function that gets the runtime parameters `run_param` in `ocaml-versions/*.json` | null string | building compiler and its dependencies |
 | PACKAGES | List of all the benchmark dependencies in sandmark | ```cpdf conf-pkg-config conf-zlib bigstringaf decompress camlzip menhirLib menhir minilight base stdio dune-private-libs dune-configurator camlimages yojson lwt zarith integers uuidm react ocplib-endian nbcodec checkseum sexplib0 eventlog-tools irmin cubicle conf-findutils index logs mtime ppx_deriving ppx_deriving_yojson ppx_irmin repr ppx_repr irmin-layers irmin-pack ``` | building benchmark |
-| PIP_DEPENDENCIES | List of Python dependencies | ```intervaltree``` | building compiler and its dependencies |
 | PRE_BENCH_EXEC | Any specific commands that needed to be executed before the benchmark. For eg. `PRE_BENCH_EXEC='taskset --cpu-list 3 setarch uname -m --addr-no-randomize'` | null string | executing benchmark | RUN_BENCH_TARGET | The executable to be used to run the benchmarks | `run_orun` | executing benchmark |
 | RUN_BENCH_TARGET | The executable to be used to run the benchmarks | `run_orun` | executing benchmark |
 | RUN_CONFIG_JSON | Input file selection that contains the list of benchmarks | `run_config.json` | executing benchmark |

From 1a15b227768ab1fd066237783eff50e7bd47fb09 Mon Sep 17 00:00:00 2001
From: Puneeth Chaganti
Date: Fri, 2 Feb 2024 13:55:41 +0530
Subject: [PATCH 3/7] Add a make target to install system dependencies

---
 .github/workflows/main.yml | 8 ++++----
 Makefile                   | 3 +++
 2 files changed, 7 insertions(+), 4 deletions(-)

diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml
index 07e3dc132..d60a2fd50 100644
--- a/.github/workflows/main.yml
+++ b/.github/workflows/main.yml
@@ -36,7 +36,7 @@ jobs:
       - name: Install dependencies
         run: |
-          sudo apt-get update && sudo apt-get -y install wget pkg-config libgmp-dev m4 libdw-dev jq python3-pip autoconf jo libgmp-dev libopenblas-dev liblapacke-dev zlib1g-dev
+          sudo apt-get update && make install-depends
 
       # Runs a set of commands using the runners shell
       - name: 5.3.0+trunk+serial
@@ -92,7 +92,7 @@
       - name: Install dependencies
         run: |
-          sudo apt-get update && sudo apt-get -y install wget pkg-config libgmp-dev m4 libdw-dev jq python3-pip autoconf jo libgmp-dev libopenblas-dev liblapacke-dev zlib1g-dev
+          sudo apt-get update && make install-depends
 
       # Runs a set of commands using the runners shell
       - name: 5.2.0+trunk+serial
@@ -145,7 +145,7 @@
       - name: Install dependencies
         run: |
-          sudo apt-get update && sudo apt-get -y install wget pkg-config libgmp-dev m4 libdw-dev jq python3-pip autoconf jo libgmp-dev libopenblas-dev liblapacke-dev zlib1g-dev
+          sudo apt-get update && make install-depends
 
       - name: 4.14.0+serial
         run: |
@@ -184,7 +184,7 @@ jobs:
       - name: test_notebooks
         run: |
-          sudo apt-get update && sudo apt-get -y install wget pkg-config libgmp-dev m4 libdw-dev jq python3-pip jo libgmp-dev
+          sudo apt-get update && make install-depends
           python3 -m pip install markupsafe==2.0.1
           export PATH=$PATH:/home/opam/.local/bin
           pip3 install jupyter nbconvert seaborn==0.11.2 pandas==1.5.3 numpy==1.23.5
diff --git a/Makefile b/Makefile
index e4b9c2724..f088cf74b 100644
--- a/Makefile
+++ b/Makefile
@@ -380,6 +380,9 @@ depend: check_url load_check
	$(foreach d, $(DEPENDENCIES), $(call check_dependency, $(d), dpkg -l, Install on Ubuntu using apt.))
	$(foreach d, $(PIP_DEPENDENCIES), $(call check_dependency, $(d), pip3 list --format=columns, Install using pip3 install.))
 
+install-depends:
+	sudo apt-get install --no-install-recommends --assume-yes $(DEPENDENCIES)
+
 check-parallel/%:
	$(eval CONFIG_SWITCH_NAME = $*)
	@{ if [[ $(BUILD_BENCH_TARGET) =~ multibench* && $(CONFIG_SWITCH_NAME) =~ 4.14* ]]; then \

From 04efb733e62e54d0f0c04c18dd6caec1506da40a Mon Sep 17 00:00:00 2001
From: Puneeth Chaganti
Date: Fri, 2 Feb 2024 14:01:43 +0530
Subject: [PATCH 4/7] readme: Clean local benchmarking run instructions

- Switch to using the install-depends make target
- Recommend using the run_all_*.sh scripts, instead of manual commands

---
 README.md | 37 +++++++++++--------------------------
 1 file changed, 11 insertions(+), 26 deletions(-)

diff --git a/README.md b/README.md
index 9c3152e4e..fb5f2e67c 100644
--- a/README.md
+++ b/README.md
@@ -17,45 +17,30 @@ config](https://github.com/ocaml-bench/sandmark-nightly-config#adding-your-compi
 on if you are interested in setting up your own instance of Sandmark for local
 runs.
 
-## Quick Start
+## How do I run the benchmarks locally?
-On Ubuntu 18.04.4 LTS you can try the following commands:
+On Ubuntu 20.04.4 LTS or newer, you can run the following commands:
 
 ```bash
-$ sudo apt-get install curl git libgmp-dev libdw-dev python3-pip jq jo bubblewrap \
-    pkg-config m4 unzip
+# Clone the repository
+$ git clone https://github.com/ocaml-bench/sandmark.git && cd sandmark
+
+# Install dependencies
+$ make install-depends
 
 # Install OPAM if not available already
 $ sh <(curl -sL https://raw.githubusercontent.com/ocaml/opam/master/shell/install.sh)
 $ opam init
 
-$ git clone https://github.com/ocaml-bench/sandmark.git
-$ cd sandmark
-
-## For 4.14.0+domains
-
-$ make ocaml-versions/4.14.0+domains.bench
-
-## For 5.1.0+trunk
+## You can run all the serial or parallel benchmarks using the respective run_all_*.sh scripts
+## You can edit the scripts to change the ocaml-version for which to run the benchmarks
 
-$ opam pin add -n --yes dune 3.5.0
-$ opam install dune
-
-$ TAG='"run_in_ci"' make run_config_filtered.json
-$ USE_SYS_DUNE_HACK=1 RUN_CONFIG_JSON=run_config_filtered.json make ocaml-versions/5.1.0+trunk.bench
+$ bash run_all_serial.sh    # Run all serial benchmarks
+$ bash run_all_parallel.sh  # Run all parallel benchmarks
 ```
 
 You can now find the results in the `_results/` folder.
 
-## Pre-requisites
-
-On GNU/Linux you need to have `libgmp-dev` installed for several of
-the benchmarks to work. You also need to have `libdw-dev` installed
-for the profiling functionality of orun to work on Linux.
-
-You can run `make depend` that will check for any missing
-dependencies.
-
 ## Overview
 
 Sandmark uses opam, with a static local repository, to build external

From 4ab76b7b27725615cbf893278beb65cc226e1d45 Mon Sep 17 00:00:00 2001
From: Puneeth Chaganti
Date: Fri, 2 Feb 2024 14:31:02 +0530
Subject: [PATCH 5/7] readme: Add an FAQ before the detailed overview of the
 benchmarks

---
 README.md | 59 +++++++++++++++++++++++++++++++++++++++++--------------
 1 file changed, 44 insertions(+), 15 deletions(-)

diff --git a/README.md b/README.md
index fb5f2e67c..afb78e720 100644
--- a/README.md
+++ b/README.md
@@ -17,6 +17,8 @@ config](https://github.com/ocaml-bench/sandmark-nightly-config#adding-your-compi
 on if you are interested in setting up your own instance of Sandmark for local
 runs.
 
+# FAQ
+
 ## How do I run the benchmarks locally?
 
 On Ubuntu 20.04.4 LTS or newer, you can run the following commands:
@@ -41,7 +43,36 @@ $ bash run_all_parallel.sh  # Run all parallel benchmarks
 
 You can now find the results in the `_results/` folder.
 
-## Overview
+## How do I add new benchmarks?
+
+See [CONTRIBUTING.md](./CONTRIBUTING.md)
+
+## How do I visualize the benchmark results?
+
+### Local runs
+
+1. To visualize the local results, there are a handful of IPython notebooks
+   available in [notebooks/](./notebooks/), which are maintained on a
+   best-effort basis. See the [README](./notebooks/README.md) for more
+   information on how to use them.
+
+2. You can run
+   [sandmark-nightly](https://github.com/ocaml-bench/sandmark-nightly?tab=readme-ov-file#how-to-run-the-webapp-locally)
+   locally and visualize the local results directory using the local Sandmark
+   nightly app.
+
+### Nightly production runs
+
+Sandmark benchmarks are configured to run nightly on [navajo](./nightly_navajo.sh) and
+[turing](./nightly_turing.sh). The results for these benchmark runs are available at
+[sandmark.tarides.com](https://sandmark.tarides.com).
+
+## How are the machines tuned for the benchmarking?
+
+You can find detailed notes on the OS settings for the benchmarking servers
+[here](https://github.com/ocaml-bench/ocaml_bench_scripts/?tab=readme-ov-file#notes-on-hardware-and-os-settings-for-linux-benchmarking).
+
+# Overview
 
 Sandmark uses opam, with a static local repository, to build external
 libraries and applications. It then builds any sandmark OCaml
 benchmarks as defined in the `run_config.json`
 
 These stages are implemented in:
 
- - Opam setup: the `Makefile` handles the creation of an opam switch
-   that builds a custom compiler as specified in the
-   `ocaml-versions/.var` file. It then installs all the
-   required packages; these packages are statically defined by their
-   opam files in the `dependencies` directory.
+ - Opam setup: the `Makefile` handles the creation of an opam switch that
+   builds a custom compiler as specified in the `ocaml-versions/.json`
+   file. It then installs all the required packages; the package versions are
+   defined in `dependencies/template/*.opam` files. The dependencies can be
+   patched or tweaked using the `dependencies` directory.
 
 - Runplan: the list of benchmarks which will run along with the
   measurement wrapper (e.g. orun or perf) is specified in
@@ -69,6 +100,7 @@ These stages are implemented in:
 `run_config.json` and specified via the `RUN_BENCH_TARGET` variable
 passed to the makefile.
 
+
 ## Configuration of the compiler build
 
 The compiler variant and its configuration options can be specified in
@@ -99,11 +131,12 @@ The various options are described below:
 
 ### orun
 
-The orun wrapper is packaged in `orun/`, it collects runtime and OCaml
-garbage collector statistics producing output in a JSON format. You
-can use orun independently of the sandmark benchmarking suite, by
-installing it as an opam pin (e.g. `opam install .` from within
-`orun/`).
+The orun wrapper is packaged as a separate package
+[here](https://opam.ocaml.org/packages/orun/). It collects runtime and OCaml
+garbage collector statistics producing output in a JSON format.
+
+You can use orun independently of the sandmark benchmarking suite, by
+installing it, e.g. using `opam install orun`.
 
 ### Using a directory different than /home
 
@@ -253,10 +286,6 @@ repo](https://github.com/ocaml-bench/sandmark-nightly/commits/main), so
 that they can be visualized using the [sandmark nightly
 UI](https://sandmark.tarides.com/)
 
-### Adding benchmarks
-
-See [CONTRIBUTING.md](./CONTRIBUTING.md)
-
 ### Config files
 
 The `*_config.json` files used to build benchmarks

From b86675d73c5e19868a74c56f9761b5339b237be2 Mon Sep 17 00:00:00 2001
From: Puneeth Chaganti
Date: Fri, 2 Feb 2024 14:31:50 +0530
Subject: [PATCH 6/7] readme: Remove outdated UI setup documentation

---
 README.md | 42 ------------------------------------------
 1 file changed, 42 deletions(-)

diff --git a/README.md b/README.md
index afb78e720..b0d48b763 100644
--- a/README.md
+++ b/README.md
@@ -304,48 +304,6 @@ The following table marks the benchmarks that are currently not working with any
 | 5.0.0+trunk.bench    | irmin benchmarks | [sandmark#262](https://github.com/ocaml-bench/sandmark/issues/262) |
 | 4.14.0+domains.bench | irmin benchmarks | [sandmark#262](https://github.com/ocaml-bench/sandmark/issues/262) |
 
-## UI
-
-JupyterHub is a multi-user server for hosting Jupyter notebooks. The
-Littlest JupyterHub (TLJH) installation is capable of hosting 0-100
-users.
-
-The following steps can be used for installation on Ubuntu 18.04.4 LTS:
-
-```bash
-$ sudo apt install python3 python3-dev git curl
-$ curl https://raw.githubusercontent.com/jupyterhub/the-littlest-jupyterhub/master/bootstrap/bootstrap.py | \
-  sudo -E python3 - --admin adminuser
-```
-
-If you would like to run the the service on a specific port, say
-"8082", you need to update the same in /opt/tljh/state/traefix.toml
-file.
-
-You can verify that the services are running from:
-
-```bash
-$ sudo systemctl status traefik
-$ sudo systemctl status jupyterhub
-```
-
-By default, the hub login opens at hostname:15001/hub/login, which is
-used by the admin user to create user accounts. The users will be able
-to login using hostname:8082/user/username/tree.
-
-You can also setup HTTPS using Let's Encrypt with JuptyerHub using the
-following steps:
-
-```bash
-$ sudo tljh-config set https.enabled true
-$ sudo tljh-config set https.letsencrypt.email e-mail
-$ sudo tljh-config add-item https.letsencrypt.domains example.domain
-$ sudo tljh-config show
-$ sudo tljh-config reload proxy
-```
-
-Reference: https://tljh.jupyter.org/en/latest/install/custom-server.html
-
 ## Multicore Notes
 
 ### ctypes

From 23de8f3fd0d1a367cbe96adb6925a0b6f19cb8b0 Mon Sep 17 00:00:00 2001
From: Puneeth Chaganti
Date: Fri, 2 Feb 2024 14:43:27 +0530
Subject: [PATCH 7/7] contributing: Update dependency adding to the latest
 method

---
 CONTRIBUTING.md | 10 ++++++----
 1 file changed, 6 insertions(+), 4 deletions(-)

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index e228e8c6b..e77c0ae12 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -3,10 +3,12 @@ You can add new benchmarks as follows:
 
 - **Add dependencies to packages:** If there are any package dependencies your
-  benchmark has that are not already included in Sandmark, add its opam file to
-  `dependencies/packages///opam`. If the package
-  depends on other packages, repeat this step for all of those packages. Add
-  the package to `PACKAGES` variable in the Makefile.
+  benchmark has that are not already included in Sandmark, you can add it as a
+  dependency to `dependencies/template/dev-*.opam`. If you need to apply any
+  patches to the dependency, add its opam file to
+  `dependencies/packages///opam`, and the patch
+  to the `dependencies/packages///files/`
+  directory.
 
 - **Add benchmark files:** Find a relevant folder in `benchmarks/` and add
   your code to it. Feel free to create a new folder if you don't find any existing