WIP: Reorganize datamodel scripts #394

Open · wants to merge 30 commits into base: master
Commits (30)
a0ece59 - Reorganize datamodel scripts (ponceta, Feb 3, 2025)
53431fa - Create dumps (ponceta, Feb 3, 2025)
d2761c6 - Move tests as well (ponceta, Feb 3, 2025)
771a792 - Create schemaspy.properties (ponceta, Feb 3, 2025)
5f2d46f - Update init_db.sh (ponceta, Feb 3, 2025)
c629615 - add pg_ in front of pg_services names to avoid confusion with db names (ponceta, Feb 4, 2025)
c608828 - Add docker-compose file (ponceta, Feb 4, 2025)
11560e7 - Add requirements-psycopg2/3.txt (ponceta, Feb 4, 2025)
5ee3393 - rename init_qwat.sh into setup.sh to avoid confusion with docker init… (ponceta, Feb 4, 2025)
09df9b8 - chmod +x (ponceta, Feb 4, 2025)
0aec9fe - Add permissions to script execution (ponceta, Feb 4, 2025)
5ea535b - remove initdb (ponceta, Feb 4, 2025)
b35a7a6 - Use github workspace (ponceta, Feb 4, 2025)
aa1e49a - Initialize container (ponceta, Feb 4, 2025)
13b2c54 - alter postgres/postgis version (ponceta, Feb 4, 2025)
b98f175 - Check existence of init_db.sh (ponceta, Feb 4, 2025)
3dd3fae - Update datamodel-create-dumps.yml (ponceta, Feb 4, 2025)
0d349a2 - check with obsolete docker-compose version argument (ponceta, Feb 4, 2025)
d7a76d5 - Update datamodel-create-dumps.yml (ponceta, Feb 4, 2025)
1e3adb4 - Update datamodel-create-dumps.yml (ponceta, Feb 4, 2025)
6703073 - add permissions for tests.sh (ponceta, Feb 4, 2025)
c8e0797 - Update tests.sh (ponceta, Feb 4, 2025)
1dd2d71 - update / chmod +x on .sh files (ponceta, Feb 4, 2025)
114d9a8 - Add chmod+x (ponceta, Feb 4, 2025)
88b4a5b - fix permissions (3nids, Feb 11, 2025)
d21be91 - remove env (3nids, Feb 11, 2025)
e3bfc74 - Fix importlib instead of deprecated imp (ponceta, Feb 17, 2025)
8663cdb - Update schemaspy.properties (ponceta, Feb 17, 2025)
ee93e55 - pgsql11 stands for pgsql 11 and later (ponceta, Feb 18, 2025)
7df61a5 - Update schemaspy.properties (ponceta, Feb 18, 2025)
8 changes: 4 additions & 4 deletions .build/travis_before_script.sh
@@ -10,21 +10,21 @@ dbname=postgres
user=postgres
password=postgres

[qwat_prod]
[pg_qwat_prod]
host=localhost
dbname=qwat_prod
user=postgres
password=postgres

[qwat_test]
[pg_qwat_test]
host=localhost
dbname=qwat_test
user=postgres
password=postgres

[qwat_comp]
[pg_qwat_comp]
host=localhost
dbname=qwat_comp
dbname=pg_qwat_comp
user=postgres
password=postgres
EOF
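The service sections above gain a `pg_` prefix. A quick sanity check that the renamed services resolve, a minimal sketch assuming the file written above is the active pg_service.conf:

```sh
# Sketch: confirm each renamed pg_service entry connects
# (service names taken from the diff above)
for svc in pg_qwat_prod pg_qwat_test pg_qwat_comp; do
    psql "service=$svc" -c 'SELECT current_database();'
done
```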
26 changes: 13 additions & 13 deletions .build/travis_script.sh
@@ -23,40 +23,40 @@ CREATE TABLE qwat_table_test_ (qwat_column_test_ text);
EOF

# Restore the 1.2.1 dump in the prod database
pum restore -p qwat_prod qwat_dump.backup
pum restore -p pg_qwat_prod qwat_dump.backup

# Set the baseline for the prod database
pum baseline -p qwat_prod -t qwat_sys.info -d $DELTA_DIRS -b 1.2.1
pum baseline -p pg_qwat_prod -t qwat_sys.info -d $DELTA_DIRS -b 1.2.1

# Run init_qwat.sh to create the last version of qwat db used as the comp database
# Run setup.sh to create the last version of qwat db used as the comp database
echo "::group::Initialize database"
$TRAVIS_BUILD_DIR/init_qwat.sh -p qwat_comp -s 21781 -r -n
psql service=qwat_comp -f $EXTRA_DELTA_FILE
$TRAVIS_BUILD_DIR/setup.sh -p pg_qwat_comp -s 21781 -r -n
psql service=pg_qwat_comp -f $EXTRA_DELTA_FILE
echo "::endgroup::"

# Set the baseline for the comp database
pum baseline -p qwat_comp -t qwat_sys.info -d $DELTA_DIRS -b $VERSION
pum baseline -p pg_qwat_comp -t qwat_sys.info -d $DELTA_DIRS -b $VERSION

# Run test_and_upgrade
echo "::group::Run test and upgrade"
yes | pum test-and-upgrade -pp qwat_prod -pt qwat_test -pc qwat_comp -t qwat_sys.info -d $DELTA_DIRS -f /tmp/qwat_dump -i views rules triggers
yes | pum test-and-upgrade -pp pg_qwat_prod -pt pg_qwat_test -pc pg_qwat_comp -t qwat_sys.info -d $DELTA_DIRS -f /tmp/qwat_dump -i views rules triggers
echo "::endgroup::"

# Run a last check between qwat_prod and qwat_comp
pum check -p1 qwat_prod -p2 qwat_comp -i views rules triggers
# Run a last check between pg_qwat_prod and pg_qwat_comp
pum check -p1 pg_qwat_prod -p2 pg_qwat_comp -i views rules triggers

# Extend qwat_prod with a customization
# Extend pg_qwat_prod with a customization
echo "::group::Extend database with a customization"
$TRAVIS_BUILD_DIR/.build/customizations/sigip/init.sh -p qwat_prod -s 21781
$TRAVIS_BUILD_DIR/.build/customizations/sigip/init.sh -p pg_qwat_prod -s 21781
echo "::endgroup::"

# Run upgrade with customizations/sigip/delta as an extra delta dir
DELTA_DIRS="$DELTA_DIRS $TRAVIS_BUILD_DIR/.build/customizations/sigip/delta"
echo "::group::Run upgrade"
pum upgrade -p qwat_prod -t qwat_sys.info -d $DELTA_DIRS
pum upgrade -p pg_qwat_prod -t qwat_sys.info -d $DELTA_DIRS
echo "::endgroup::"

# New test for upgrade
psql service=qwat_prod -c "DROP TABLE qwat_table_test_"
psql service=pg_qwat_prod -c "DROP TABLE qwat_table_test_"
yes | $TRAVIS_BUILD_DIR/update/upgrade_db.sh -p $PGSERVICEFILE -c -e $TRAVIS_BUILD_DIR/.build/customizations/sigip -t /tmp/qwat.dmp -u
exit 0
16 changes: 8 additions & 8 deletions .deploy/create_release.py
@@ -52,7 +52,7 @@ def create_dumps():
'--file', dumpfile,
'--schema', 'qwat_dr',
'--schema', 'qwat_od',
'service=qwat_prod']
'service=pg_qwat_prod']
)
files.append(dumpfile)
print('::endgroup::')
@@ -71,7 +71,7 @@ def create_dumps():
'--file', dumpfile,
'--schema', 'qwat_dr',
'--schema', 'qwat_od',
'service=qwat_prod']
'service=pg_qwat_prod']
)
files.append(dumpfile)
print('::endgroup::')
@@ -91,7 +91,7 @@ def create_dumps():
'--verbose',
'--file', dumpfile,
'-N', 'public',
'service=qwat_prod']
'service=pg_qwat_prod']
)
files.append(dumpfile)
print('::endgroup::')
@@ -108,7 +108,7 @@ def create_dumps():
'--verbose',
'--file', dumpfile,
'-N', 'public',
'service=qwat_prod']
'service=pg_qwat_prod']
)
files.append(dumpfile)
print('::endgroup::')
@@ -127,7 +127,7 @@ def create_dumps():
'--verbose',
'--file', dumpfile,
'-N', 'public',
'service=qwat_prod']
'service=pg_qwat_prod']
)
files.append(dumpfile)
print('::endgroup::')
@@ -144,7 +144,7 @@ def create_dumps():
'--verbose',
'--file', dumpfile,
'-N', 'public',
'service=qwat_prod']
'service=pg_qwat_prod']
)
files.append(dumpfile)
print('::endgroup::')
@@ -164,7 +164,7 @@ def create_dumps():
'--verbose',
'--file', dumpfile,
'--schema', 'qwat_vl',
'service=qwat_prod']
'service=pg_qwat_prod']
)
files.append(dumpfile)
print('::endgroup::')
@@ -182,7 +182,7 @@ def create_dumps():
'--verbose',
'--file', dumpfile,
'--schema', 'qwat_vl',
'service=qwat_prod']
'service=pg_qwat_prod']
)
files.append(dumpfile)
print('::endgroup::')
11 changes: 11 additions & 0 deletions .github/dependabot.yml
@@ -0,0 +1,11 @@
version: 2
updates:
  - package-ecosystem: pip
    directory: "/"
    schedule:
      interval: monthly

  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "monthly"
61 changes: 61 additions & 0 deletions .github/workflows/datamodel-create-dumps.yml
@@ -0,0 +1,61 @@
name: 📦 Datamodel | Create dumps

concurrency:
  group: dumps-${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

on:
  push:
    branches:
      - master
    paths:
      - datamodel/**
      - '.github/workflows/datamodel-create-dumps.yml'
  pull_request:
    branches:
      - master
    paths:
      - datamodel/**
      - '.github/workflows/datamodel-create-dumps.yml'
  workflow_dispatch:
  workflow_call:


jobs:
  datamodel-dumps:
    name: Create dumps and schemaspy of datamodel
    runs-on: ubuntu-24.04
    env:
      COMPOSE_PROFILES: schemaspy

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Docker build
        run: docker compose up -d --build

      - name: Initialize container
        run: docker compose exec db init_db.sh wait

      - name: Create dumps
        run: docker compose exec db /src/datamodel/scripts/create-dumps.py

      - name: Schemaspy
        run: docker compose run schemaspy

      - name: Docker logs
        if: failure()
        run: docker compose logs db

      - uses: actions/upload-artifact@v4
        with:
          name: datamodel-dumps
          path: datamodel/artifacts/
          if-no-files-found: error

      - uses: actions/upload-artifact@v4
        with:
          name: datamodel-schemaspy
          path: datamodel/schemaspy/
          if-no-files-found: error
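The workflow steps translate directly into a local run; a sketch assuming the docker-compose.yml added in this PR defines the `db` service and the `schemaspy` profile used above:

```sh
# Local equivalent of the workflow above; commands and paths are taken from its steps
export COMPOSE_PROFILES=schemaspy

docker compose up -d --build                                   # build and start the database container
docker compose exec db init_db.sh wait                         # initialize the datamodel ('wait' argument as in the workflow)
docker compose exec db /src/datamodel/scripts/create-dumps.py  # dumps are collected from datamodel/artifacts/
docker compose run schemaspy                                   # schemaspy output is collected from datamodel/schemaspy/
```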
37 changes: 27 additions & 10 deletions .github/workflows/run_tests.yml
@@ -1,4 +1,8 @@
name: Run tests
name: 🐘 Datamodel | Tests

concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true

on:
push:
@@ -17,11 +21,12 @@ on:
jobs:

run-tests:
runs-on: ubuntu-latest
name: Run unit tests on datamodel
runs-on: ubuntu-24.04

services:
postgres:
image: postgis/postgis:9.6-2.5
image: postgis/postgis:15-3.5
env:
POSTGRES_DB: qwat_test
POSTGRES_PASSWORD: postgres
@@ -33,28 +38,40 @@ jobs:
--health-timeout 5s
--health-retries 5
env:
PGSERVICEFILE: ${{github.workspace}}/tests/pg_service.conf
PGSERVICEFILE: ${{github.workspace}}/datamodel/tests/pg_service.conf

steps:

- uses: actions/checkout@v3
- uses: actions/checkout@v4

- name: Install PostgreSQL client
run: |
sudo apt-get update
sudo apt-get install --yes postgresql-client

- name: Install Python dependencies
run: pip install -r requirements.txt
run: pip install -r datamodel/requirements.txt

- name: Grant execute permissions for tests.sh
run: chmod +x datamodel/tests/tests.sh

- name: Run tests
run: tests/tests.sh
run: datamodel/tests/tests.sh

- name: Grant execute permissions for tests_scalability.sh
run: chmod +x datamodel/tests/tests_scalability.sh

- name: Run scalability tests
run: tests/tests_scalability.sh -i 20
run: datamodel/tests/tests_scalability.sh -i 20

- name: Grant execute permissions for tests_scalability_multithread.sh
run: chmod +x datamodel/tests/tests_scalability_multithread.sh

- name: Run scalability tests multithreaded
run: tests/tests_scalability_multithread.sh -i 20
run: datamodel/tests/tests_scalability_multithread.sh -i 20

- name: Grant execute permissions for tests_upgrade.sh
run: chmod +x datamodel/tests/tests_upgrade.sh

- name: Run upgrade tests
run: tests/tests_upgrade.sh
run: datamodel/tests/tests_upgrade.sh
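Each test step above is preceded by an explicit chmod. An alternative, not part of this PR, would be to record the executable bit in git so those steps become unnecessary; a sketch:

```sh
# Optional alternative (not in this PR): persist the executable bit in the repository
git update-index --chmod=+x \
    datamodel/tests/tests.sh \
    datamodel/tests/tests_scalability.sh \
    datamodel/tests/tests_scalability_multithread.sh \
    datamodel/tests/tests_upgrade.sh
git commit -m "Make test scripts executable"
```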
1 change: 1 addition & 0 deletions .gitignore
@@ -5,4 +5,5 @@ tmp/
update/test_migration.expected.sql
*.orig
.idea
.env
logfile
3 changes: 2 additions & 1 deletion README.md
@@ -8,6 +8,7 @@ A full web data model documentation with diagrams and relations is available [he

# Model changelog ([Detailed](https://github.com/qwat/qwat-data-model/releases/))

- v1.4.0 : TODO
- v1.3.6 : Add sia405 mapping fields for interlis Export
- v1.3.5 : Minors typo fixes #314 #315
- v1.3.4 : Remove SIRE from core, add value list for valve nominal diameter
@@ -40,7 +41,7 @@ Tests are run automatically on commit by github actions.
To run them locally (please refer to `run_tests.yml` for up to date steps):
```sh
# start a dev postgis server
docker run --rm -d -p 5432:5432 -e POSTGRES_DB=qwat_test -e POSTGRES_PASSWORD=postgres --name=qwat_test_db postgis/postgis:9.6-2.5
docker run --rm -d -p 5432:5432 -e POSTGRES_DB=qwat_test -e POSTGRES_PASSWORD=postgres --name=qwat_test_db postgis/postgis:15-3.5

# include the pgservices for test database
cat ./tests/pg_service.conf >> ~/.pg_service.conf
50 changes: 50 additions & 0 deletions datamodel/.docker/Dockerfile
@@ -0,0 +1,50 @@
# requires buildkit (default from docker engine 23.0)

ARG POSTGIS_IMAGE=postgis/postgis:15-3.5

# arm builds are not available with 3.2
#FROM imresamu/postgis-arm64:14-3.2 AS base-arm64
FROM "${POSTGIS_IMAGE}" AS base-arm64

FROM "${POSTGIS_IMAGE}" AS base-amd64

FROM base-$BUILDARCH as common

ARG AUTO_INIT=True
ARG RUN_TEST=False
ARG PSYCOPG_VERSION=3

# System deps (bc + exiftool for testing)
RUN apt-get update && apt-get install -y python3 python3-pip python3-venv libpq-dev wget exiftool bc && apt-get clean

# Add source
ADD . /src
WORKDIR /src

# Python deps
ENV VIRTUAL_ENV=/opt/venv
RUN python3 -m venv $VIRTUAL_ENV
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
RUN pip install -r datamodel/requirements.txt
RUN pip install -r datamodel/requirements-psycopg${PSYCOPG_VERSION}.txt
RUN if [ "${RUN_TEST}" = "True" ]; then pip install -r datamodel/requirements-test.txt; fi

# Configure the postgres connections
RUN printf '[postgres]\ndbname=postgres\nuser=postgres\n' >> /etc/postgresql-common/pg_service.conf
RUN printf '[pg_qwat]\ndbname=qwat\nuser=postgres\n' >> /etc/postgresql-common/pg_service.conf
RUN printf '[pg_qwat_demo]\ndbname=qwat_demo\nuser=postgres\n' >> /etc/postgresql-common/pg_service.conf

RUN chmod +x /src/datamodel/.docker/init_db.sh
ENV PATH="/src/datamodel/.docker:${PATH}"

# Execute the main script on database initialization (zzz to be after postgis init)
RUN if [ "${AUTO_INIT}" = "True" ]; then ln -s /src/datamodel/.docker/init_db.sh /docker-entrypoint-initdb.d/zzz_init_db.sh; fi

# Some defaults
ENV POSTGRES_PASSWORD=postgres
# otherwise psycopg cannot connect
ENV PGSERVICEFILE=/etc/postgresql-common/pg_service.conf

ENV PGSERVICE=pg_qwat

ENV PYTEST_ADDOPTS="--color=yes"
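For a one-off build outside docker compose, the image can also be built directly with the arguments declared at the top of the Dockerfile; a minimal sketch, assuming the repository root as build context (the qwat-datamodel tag and the chosen build-arg values are illustrative):

```sh
# Sketch: standalone build selecting psycopg2 and bundling the test dependencies
DOCKER_BUILDKIT=1 docker build \
    --file datamodel/.docker/Dockerfile \
    --build-arg POSTGIS_IMAGE=postgis/postgis:15-3.5 \
    --build-arg PSYCOPG_VERSION=2 \
    --build-arg RUN_TEST=True \
    --tag qwat-datamodel \
    .

# With AUTO_INIT left at its default (True), init_db.sh runs on first container startup
docker run --rm -d -p 5432:5432 --name qwat_db qwat-datamodel
```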