Merge pull request #36 from Project-OMOTES/35-move-orchestrator-db-schema-to-orchestrator-and-rewire-integration-test-to-use-computation-engine-directly

35: Move orchestrator db schema here and rewire integration test to u…
lfse-slafleur committed Jun 4, 2024
2 parents dd289bc + a75027e commit d1f184b
Showing 33 changed files with 811 additions and 211 deletions.
25 changes: 25 additions & 0 deletions .github/workflows/ci.yml
@@ -12,6 +12,8 @@ jobs:
name: Setup
steps:
- uses: actions/checkout@v3
with:
submodules: 'recursive'
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
@@ -38,6 +40,8 @@ jobs:
python-version: [ "3.11" ]
steps:
- uses: actions/checkout@v3
with:
submodules: 'recursive'
- name: Restore venv
uses: actions/download-artifact@v4
with:
@@ -48,6 +52,23 @@
run: |
./ci/linux/lint.sh
integration-test:
name: Integration Test
runs-on: ubuntu-latest
needs: [ setup ]
strategy:
fail-fast: false
matrix:
python-version: [ "3.11" ]
steps:
- uses: actions/checkout@v3
with:
submodules: 'recursive'
- name: run integration tests
run: |
cd integration_test
./ci/test_integration.sh
test:
name: Test
runs-on: ubuntu-latest
@@ -58,6 +79,8 @@
python-version: [ "3.11" ]
steps:
- uses: actions/checkout@v3
with:
submodules: 'recursive'
- name: Restore venv
uses: actions/download-artifact@v4
with:
@@ -101,6 +124,8 @@ jobs:
python-version: [ "3.11" ]
steps:
- uses: actions/checkout@v3
with:
submodules: 'recursive'
- name: Restore venv
uses: actions/download-artifact@v4
with:
3 changes: 3 additions & 0 deletions .gitmodules
@@ -0,0 +1,3 @@
[submodule "computation_engine"]
path = computation-engine-at-orchestrator
url = [email protected]:Project-OMOTES/computation-engine.git
8 changes: 4 additions & 4 deletions Dockerfile
@@ -6,11 +6,11 @@ WORKDIR /app
RUN apt update && \
apt install -y libpq-dev python3-dev gcc make # Install dependencies needed for psycopg2

COPY requirements.txt /app/omotes_orchestrator/requirements.txt
RUN pip install -r /app/omotes_orchestrator/requirements.txt --no-cache-dir
COPY requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt --no-cache-dir

COPY src/omotes_orchestrator /app/omotes_orchestrator/
COPY src/ /app/

ENV PYTHONPATH="/app/"

CMD ["python", "-m", "omotes_orchestrator.main"]
CMD ["/app/start.sh"]
104 changes: 103 additions & 1 deletion README.md
@@ -2,4 +2,106 @@

This repository is part of the 'Nieuwe Warmte Nu Design Toolkit' project.

Orchestrator component of OMOTES project which monitors workflows and starts the various steps of each workflow.
Orchestrator component of OMOTES project which monitors workflows and starts the various steps of
each workflow.

# Directory structure
The following directory structure is used:

- `ci/`: Contains all CI & other development scripts to help standardize the development workflow
for both Linux and Windows.
- `computation-engine-at-orchestrator/`: Submodule link to the latest `computation-engine` release.
Provides the infrastructure needed during the integration test.
- `integration_test/`: Contains a large integration test which is run to check for robustness and
stability.
- `src/`: Source code for the orchestrator as well as the necessary database models.
- `unit_test/`: All unit tests for the orchestrator.
- `.dockerignore`: Contains all files and directories which should not be available while building
the docker image.
- `.env-template`: Template `.env` file to run the orchestrator locally outside of docker.
- `.gitignore`: Contains all files and directories which are not kept in Git source control.
- `dev-requirements.txt`: Pinned versions of all development and non-development dependencies.
- `Dockerfile`: The build instructions for building the docker image.
- `pyproject.toml`: The Python project (meta) information.
- `requirements.txt`: Pinned versions of all dependencies needed to run the orchestrator.
- `run.sh`: Script to start the orchestrator locally outside of docker on Linux.
- `run_windows.sh`: Script to start the orchestrator locally outside of docker on Windows.

# Development workflow
The scripts under `ci/` are used to standardize the development process. The following scripts are
available for Windows (under `ci/win32/` with extension `.cmd`) and Linux (under `ci/linux/` with
extension `.sh`).

- `create_venv`: Creates a local virtual environment (`.venv/`) in which all dependencies may be
installed.
- `db_models_apply_schema`: Will apply all available SQL db schema revisions to the local SQL
database.
- `db_models_generate_new_revision`: Can be used to generate a new revision of the SQL db schema.
Expects 1 argument, e.g. `ci/linux/db_models_generate_new_revision.sh "this is the revision message"`.
- `install_dependencies`: Installs all development and non-development dependencies in the local
virtual environment.
- `lint`: Run `flake8` to check for linting issues.
- `test_unit`: Run all unit tests under `unit_test/` using `pytest`.
- `typecheck`: Run `mypy` to check the type annotations and look for typing issues.
- `update_dependencies`: Update `dev-requirements.txt` and `requirements.txt` based on the
dependencies specified in `pyproject.toml`.

A normal development workflow would be to first use `create_venv` and then finish setting up the
environment using `install_dependencies`. Finally, once all changes are made, you can use `lint`,
`test_unit` and `typecheck` to check for code quality issues.

In case you need to run the orchestrator locally, both `run.sh` and `run_windows.sh` are available
to run the orchestrator bare-metal (without docker).

All these scripts are expected to be run from the root of the repository.
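
For example, a typical development session on Linux might look like the following. This is a
minimal sketch that assumes the script names above combined with the `ci/linux/` + `.sh` convention:

```bash
./ci/linux/create_venv.sh            # create the local virtual environment (.venv/)
./ci/linux/install_dependencies.sh   # install all development and non-development dependencies
# ... make your changes ...
./ci/linux/lint.sh                   # flake8 linting
./ci/linux/typecheck.sh              # mypy type checking
./ci/linux/test_unit.sh              # run the unit tests under unit_test/
```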

## Working with computation-engine submodule
The [computation-engine](https://github.com/Project-OMOTES/computation-engine/) is available
as a submodule at `computation-engine-at-orchestrator`. The name of this path is chosen
to make sure starting the `computation-engine` with `docker compose` uses the
`computation-engine-at-orchestrator` project name instead of `computation-engine`. Otherwise, if a
developer is working on both the `orchestrator` and a non-submodule `computation-engine` checkout,
the two docker environments may conflict.

To make the submodule available after cloning this repository:
```bash
git submodule update --init
```

Also, a different branch of `orchestrator` may reference a different submodule commit than
the branch you are moving from. So, whenever you check out a new branch, make sure you run
the command:
```bash
git submodule update --init
```

This will update the reference to point to the correct submodule commit.
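
Alternatively, the submodule can be initialized while cloning. This is only a convenience sketch;
the repository URL below is an assumption for illustration and not taken from this commit:

```bash
# Clone and initialize all submodules in one step.
# NOTE: the URL is assumed for illustration; substitute the actual repository URL.
git clone --recurse-submodules git@github.com:Project-OMOTES/orchestrator.git
```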

## How to work with alembic to make database revisions
First set up the development environment with `create_venv` and `install_dependencies`. Then you
can make the necessary changes to `omotes_orchestrator/db_models/`. Finally, a new SQL schema
revision may be generated using `alembic` by running `db_models_generate_new_revision`. To apply
all database revisions, run `db_models_apply_schema`.

Do not forget to actually start the PostgreSQL database locally!
This may be done with:
```bash
cd computation-engine-at-orchestrator/
cp .env-template .env
./scripts/setup.sh
./scripts/start_postgres_in_dev_mode.sh # This will start PostgreSQL with port 5432 opened on localhost
cd ../
./ci/linux/db_models_apply_schema.sh # Setup will not apply the current schema but only create the SQL database.
```

## Direct Alembic control
In case more control is necessary, you can run the alembic commands directly after
activating the virtual environment (Linux: `. ./.venv/bin/activate`,
Windows: `call venv\Scripts\activate.bat`).

First, change directory: `cd src/`

- Make a revision: `alembic revision --autogenerate -m "<some message>"`
- Apply all revisions: `alembic upgrade head`
- Downgrade to a revision: `alembic downgrade <revision>` (use revision `base` to
undo everything).
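
Putting these together, a typical direct session on Linux might look like this sketch, using only
the commands above (the revision message is just an example):

```bash
. ./.venv/bin/activate                                    # activate the virtual environment
cd src/                                                   # alembic is run from src/
alembic revision --autogenerate -m "describe the change"  # generate a new revision
alembic upgrade head                                      # apply all revisions
alembic downgrade base                                    # undo everything again
```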
9 changes: 9 additions & 0 deletions ci/linux/db_models_apply_schema.sh
@@ -0,0 +1,9 @@
#!/bin/bash

if [[ "$OSTYPE" != "win32" && "$OSTYPE" != "msys" ]]; then
echo "Activating .venv first."
. .venv/bin/activate
fi

cd src/
alembic upgrade head
9 changes: 9 additions & 0 deletions ci/linux/db_models_generate_new_revision.sh
@@ -0,0 +1,9 @@
#!/bin/bash

if [[ "$OSTYPE" != "win32" && "$OSTYPE" != "msys" ]]; then
echo "Activating .venv first."
. .venv/bin/activate
fi

cd src/
alembic revision --autogenerate -m "$1"
7 changes: 7 additions & 0 deletions ci/win32/db_models_apply_schema.cmd
@@ -0,0 +1,7 @@

pushd .
cd /D "%~dp0"
cd ..\..\
cd src\
alembic upgrade head
popd
7 changes: 7 additions & 0 deletions ci/win32/db_models_generate_new_revision.cmd
@@ -0,0 +1,7 @@

pushd .
cd /D "%~dp0"
cd ..\..\
cd src\
alembic revision --autogenerate -m %1
popd
1 change: 1 addition & 0 deletions computation-engine-at-orchestrator
7 changes: 7 additions & 0 deletions dev-requirements.txt
@@ -12,6 +12,10 @@ aiormq==6.7.7
# via
# -c requirements.txt
# aio-pika
alembic==1.13.1
# via
# -c requirements.txt
# orchestrator (pyproject.toml)
amqp==5.2.0
# via
# -c requirements.txt
@@ -84,6 +88,7 @@ kombu==5.3.4
mako==1.3.2
# via
# -c requirements.txt
# alembic
# pdoc3
markdown==3.5.2
# via
@@ -179,6 +184,7 @@ snowballstemmer==2.2.0
sqlalchemy[mypy]==2.0.28
# via
# -c requirements.txt
# alembic
# orchestrator (pyproject.toml)
streamcapture==1.2.2
# via
@@ -191,6 +197,7 @@ types-protobuf==4.24.0.20240302
typing-extensions==4.8.0
# via
# -c requirements.txt
# alembic
# mypy
# sqlalchemy
tzdata==2023.3
28 changes: 2 additions & 26 deletions integration_test/README.md
@@ -3,30 +3,6 @@ Submits a large number of jobs from multiple processes/SDKs to check if all of
them succeed.

# Setup and run the test
First, ensure that the `computation_engine` repository is available at the same level as
`orchestrator`.
First, ensure that the `computation-engine-at-orchestrator` repository is available as a submodule.

1. Make sure `.env` is available with all env vars:
```bash
POSTGRES_ROOT_USER=root
POSTGRES_ROOT_PASSWORD=1234
POSTGRES_DEV_PORT=6432
POSTGRES_ORCHESTRATOR_USER_NAME=omotes_orchestrator
POSTGRES_ORCHESTRATOR_USER_PASSWORD=somepass3

RABBITMQ_ROOT_USER=root
RABBITMQ_ROOT_PASSWORD=5678
RABBITMQ_HIPE_COMPILE=1
RABBITMQ_EXCHANGE=nwn
RABBITMQ_OMOTES_USER_NAME=omotes
RABBITMQ_OMOTES_USER_PASSWORD=somepass1
RABBITMQ_CELERY_USER_NAME=celery
RABBITMQ_CELERY_USER_PASSWORD=somepass2
```

2. Run `./setup.sh`
3. Run `docker compose up --build`
4. To setup the environment to submit all the jobs:
- `python3.11 -m venv ./.venv/`
- `pip install -r ./requirements.txt`
4. Run `run.sh` to start the test.
1. Run `./ci/test_integration.sh` to set up and start the test.
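
Based on the CI job and the teardown script added in this commit, a full local run and cleanup
might look like the following sketch (starting from the repository root):

```bash
cd integration_test
./ci/test_integration.sh          # set up the computation-engine services and run the test
./ci/remove_test_integration.sh   # tear down the docker compose environment and its volumes
```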
15 changes: 15 additions & 0 deletions integration_test/ci/_config.sh
@@ -0,0 +1,15 @@
#!/bin/bash

CURRENT_WORKDIR=$PWD
COMPUTATION_ENGINE="../computation-engine-at-orchestrator"
ENV_FILE="${CURRENT_WORKDIR}/.env.test"
DOCKER_COMPOSE_FILE="${COMPUTATION_ENGINE}/docker-compose.yml"
DOCKER_COMPOSE_OVERRIDE_FILE="./docker-compose.override.yml"

export COMPOSE_PROJECT_NAME=omotes_orchestrator_integration_tests

export ORCHESTRATOR_DIR="${CURRENT_WORKDIR}/../"
export TEST_WORKER_DIR="${CURRENT_WORKDIR}/test_worker/"
export INTEGRATION_TESTS_DIR="${CURRENT_WORKDIR}/integration_tests/"

echo "Using docker compose files: ${DOCKER_COMPOSE_FILE} ${DOCKER_COMPOSE_OVERRIDE_FILE}"
5 changes: 5 additions & 0 deletions integration_test/ci/remove_test_integration.sh
@@ -0,0 +1,5 @@
#!/bin/bash

. ci/_config.sh

docker compose -f ${DOCKER_COMPOSE_FILE} -f ${DOCKER_COMPOSE_OVERRIDE_FILE} --env-file ${ENV_FILE} down -v
13 changes: 13 additions & 0 deletions integration_test/ci/test_integration.sh
@@ -0,0 +1,13 @@
#!/bin/bash

. ci/_config.sh

cp ${COMPUTATION_ENGINE}/.env-template ${ENV_FILE}
sed -i 's/LOG_LEVEL=[a-z]*/LOG_LEVEL=WARNING/gi' ${ENV_FILE}

docker compose -f ${DOCKER_COMPOSE_FILE} -f ${DOCKER_COMPOSE_OVERRIDE_FILE} --env-file ${ENV_FILE} down -v

${COMPUTATION_ENGINE}/scripts/setup_orchestrator_postgres_db.sh ${ENV_FILE} ${DOCKER_COMPOSE_FILE}
${COMPUTATION_ENGINE}/scripts/setup_rabbitmq.sh ${ENV_FILE} ${DOCKER_COMPOSE_FILE}

docker compose -f ${DOCKER_COMPOSE_FILE} -f ${DOCKER_COMPOSE_OVERRIDE_FILE} --env-file ${ENV_FILE} up --build --abort-on-container-exit integration_tests orchestrator test_worker
43 changes: 43 additions & 0 deletions integration_test/docker-compose.override.yml
@@ -0,0 +1,43 @@
version: "3.8"

networks:
omotes:
external: true

services:
orchestrator:
build: ${ORCHESTRATOR_DIR}
image: !reset

test_worker:
build: ${TEST_WORKER_DIR}
deploy:
replicas: 3
networks:
- omotes
environment:
RABBITMQ_HOSTNAME: rabbitmq-nwn
RABBITMQ_PORT: 5672
RABBITMQ_USERNAME: ${RABBITMQ_CELERY_USER_NAME}
RABBITMQ_PASSWORD: ${RABBITMQ_CELERY_USER_PASSWORD}
RABBITMQ_VIRTUALHOST: omotes_celery
LOG_LEVEL: ${LOG_LEVEL}

integration_tests:
build: ${INTEGRATION_TESTS_DIR}
networks:
- omotes
depends_on:
rabbitmq:
condition: service_healthy
omotes_influxdb:
condition: service_healthy
orchestrator:
condition: service_started
environment:
RABBITMQ_OMOTES_USER_NAME: ${RABBITMQ_OMOTES_USER_NAME}
RABBITMQ_OMOTES_USER_PASSWORD: ${RABBITMQ_OMOTES_USER_PASSWORD}
RABBITMQ_VIRTUALHOST: omotes
RABBITMQ_HOST: rabbitmq
RABBITMQ_PORT: 5672
LOG_LEVEL: ${LOG_LEVEL}