
Cherrypicks from 1.8 #1065

Merged: 18 commits, Aug 23, 2024
Changes from 12 commits
11 changes: 9 additions & 2 deletions .dockerignore
@@ -1,3 +1,10 @@
datacube_wms/wms_cfg_local.py
.pytest_cache
*/__pycache__
**/.pytest_cache
**/__pycache__
.hypothesis

venv
.venv

**/.pixi
.git
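The `.dockerignore` change above replaces single-level patterns with `**/` globs. The distinction matters because `*/__pycache__` only ignores caches one directory deep, while `**/__pycache__` matches at any depth. A small sketch using `pathlib` globbing (an approximation of Docker's Go-style matching, not Docker itself) illustrates the difference:

```python
# Sketch: "*/__pycache__" matches one level deep only; "**/__pycache__"
# matches at any depth. pathlib's glob semantics approximate the
# .dockerignore pattern rules used by Docker.
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
(root / "pkg" / "__pycache__").mkdir(parents=True)
(root / "pkg" / "sub" / "__pycache__").mkdir(parents=True)

one_level = sorted(p.relative_to(root).as_posix()
                   for p in root.glob("*/__pycache__"))
any_depth = sorted(p.relative_to(root).as_posix()
                   for p in root.glob("**/__pycache__"))

print(one_level)  # ['pkg/__pycache__']
print(any_depth)  # ['pkg/__pycache__', 'pkg/sub/__pycache__']
```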
1 change: 0 additions & 1 deletion .env_simple
@@ -14,7 +14,6 @@ POSTGRES_USER=opendatacubeusername
SERVER_DB_USERNAME=opendatacubeusername
POSTGRES_PASSWORD=opendatacubepassword
POSTGRES_DB="odc_postgres,odc_postgis"
READY_PROBE_DB=odc_postgis

#################
# OWS CFG Config
9 changes: 4 additions & 5 deletions .github/workflows/test.yml
@@ -48,23 +48,22 @@ jobs:
# the production image
- name: Build dev OWS image
run: |
docker build \
docker build --build-arg ENVIRONMENT=test \
--tag ${ORG}/${IMAGE}:_builder \
.

- name: Test and lint dev OWS image
run: |
mkdir artifacts
docker run -e LOCAL_UID=$(id -u $USER) -e LOCAL_GID=$(id -g $USER) -v ${PWD}/artifacts:/mnt/artifacts ${ORG}/${IMAGE}:_builder /bin/sh -c "cd /code;./check-code.sh"
chmod 777 artifacts
docker run -e LOCAL_UID=1000 -e LOCAL_GID=1000 -u ubuntu -v ${PWD}/artifacts:/mnt/artifacts ${ORG}/${IMAGE}:_builder /bin/sh -c "cd /code && ./check-code.sh"
mv ./artifacts/coverage.xml ./artifacts/coverage-unit.xml

- name: Dockerized Integration Pytest
run: |
export LOCAL_UID=$(id -u $USER)
export LOCAL_GID=$(id -g $USER)
export $(grep -v '^#' .env_simple | xargs)
docker compose -f docker-compose.yaml -f docker-compose.db.yaml up -d --wait --build
docker compose -f docker-compose.yaml -f docker-compose.db.yaml exec -T ows /bin/sh -c "cd /code;./check-code-all.sh"
docker compose -f docker-compose.yaml -f docker-compose.db.yaml exec -u ubuntu -T ows /bin/sh -c "cd /code && ./check-code-all.sh"
docker compose -f docker-compose.yaml -f docker-compose.db.yaml down

- name: Upload All coverage to Codecov
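The workflow step above loads variables with `export $(grep -v '^#' .env_simple | xargs)`. A minimal Python equivalent of that parsing step (a hypothetical helper, not part of the repository) shows what the shell one-liner does — skip blanks and comments, split on the first `=`:

```python
def load_env(text: str) -> dict:
    """Parse KEY=VALUE lines, skipping blanks and '#' comments,
    mirroring `export $(grep -v '^#' .env_simple | xargs)`.
    Quote stripping is a convenience here; the shell performs it
    via word splitting in xargs."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key] = value.strip('"')
    return env

sample = """
# OWS DB config
POSTGRES_USER=opendatacubeusername
POSTGRES_DB="odc_postgres,odc_postgis"
"""
env = load_env(sample)
print(env["POSTGRES_USER"])
print(env["POSTGRES_DB"])
```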
97 changes: 43 additions & 54 deletions Dockerfile
@@ -1,92 +1,81 @@
# Note that this is now pinned to a fixed version. Remember to check for new versions periodically.
FROM ghcr.io/osgeo/gdal:ubuntu-small-3.8.5 AS builder
FROM ghcr.io/osgeo/gdal:ubuntu-small-3.9.1 AS builder

# Setup build env for postgresql-client-14
# Environment is test or deployment.
ARG ENVIRONMENT=deployment

# Setup build env for postgresql-client-16
USER root
RUN apt-get update -y \
&& DEBIAN_FRONTEND=noninteractive apt-get install -y --fix-missing --no-install-recommends \
git \
# For pybabel
python3-distutils \
# For Psycopg2
libpq-dev python3-dev \
gcc \
python3-pip \
postgresql-client-14 \
postgresql-client-16 \
# For Pyproj build \
proj-bin proj-data libproj-dev \
proj-bin libproj-dev \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* /var/dpkg/* /var/tmp/* /var/log/dpkg.log

ENV GDAL_DISABLE_READDIR_ON_OPEN="EMPTY_DIR"

# Copy source code and install it
WORKDIR /code
COPY . /code

RUN echo "version=\"$(python3 setup.py --version)\"" > datacube_ows/_version.py
RUN pip install --no-cache-dir .[ops,test]

## Only install pydev requirements if arg PYDEV_DEBUG is set to 'yes'
ARG PYDEV_DEBUG="no"
RUN if [ "$PYDEV_DEBUG" = "yes" ]; then \
pip install --no-cache-dir .[dev] \
;fi
WORKDIR /build

RUN pip freeze
RUN python3 -m pip --disable-pip-version-check -q wheel --no-binary psycopg2 psycopg2 \
&& ([ "$ENVIRONMENT" = "deployment" ] || \
python3 -m pip --disable-pip-version-check -q wheel --no-binary pyproj pyproj)

# Should match builder base.
FROM ghcr.io/osgeo/gdal:ubuntu-small-3.8.5
FROM ghcr.io/osgeo/gdal:ubuntu-small-3.9.1

RUN apt-get update -y \
&& DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
# Environment is test or deployment.
ARG ENVIRONMENT=deployment
RUN export DEBIAN_FRONTEND=noninteractive \
&& apt-get update -y \
&& apt-get install -y --no-install-recommends \
git \
gosu \
python3-pip \
tini \
&& ([ "$ENVIRONMENT" = "deployment" ] || \
apt-get install -y --no-install-recommends \
proj-bin) \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* /var/dpkg/* /var/tmp/* /var/log/dpkg.log

# Add login-script for UID/GID-remapping.
COPY --chown=root:root --link docker/files/remap-user.sh /usr/local/bin/remap-user.sh

# all the python pip installed libraries
COPY --from=builder /usr/local/lib/python3.10/dist-packages /usr/local/lib/python3.10/dist-packages
COPY --from=builder /usr/lib/python3/dist-packages /usr/lib/python3/dist-packages
COPY --from=builder /usr/lib/python3.10/distutils/* /usr/lib/python3.10/distutils/
# postgres client
COPY --from=builder /usr/lib/postgresql /usr/lib/postgresql
COPY --from=builder /usr/share/postgresql /usr/share/postgresql
# perl5 is used for pg_isready
COPY --from=builder /usr/share/perl5 /usr/share/perl5
COPY --from=builder /usr/bin/pg_isready /usr/bin/pg_isready
# datacube cli
COPY --from=builder /usr/local/bin/datacube /usr/local/bin/datacube
# datacube-ows cli
COPY --from=builder /usr/local/bin/datacube-ows /usr/local/bin/datacube-ows
# datacube-ows-update cli
COPY --from=builder /usr/local/bin/datacube-ows-update /usr/local/bin/datacube-ows-update
# datacube-ows-cfg check
COPY --from=builder /usr/local/bin/datacube-ows-cfg /usr/local/bin/datacube-ows-cfg
# flask cli
COPY --from=builder /usr/local/bin/flask /usr/local/bin/flask
# gunicorn cli
COPY --from=builder /usr/local/bin/gunicorn /usr/local/bin/gunicorn
# pybabel cli
COPY --from=builder /usr/local/bin/pybabel /usr/local/bin/pybabel

# Copy source code and install it
WORKDIR /code
COPY . /code

## Only install pydev requirements if arg PYDEV_DEBUG is set to 'yes'
ARG PYDEV_DEBUG="no"
COPY --from=builder --link /build/*.whl ./
RUN EXTRAS=$([ "$ENVIRONMENT" = "deployment" ] || echo ",test") && \
python3 -m pip --disable-pip-version-check install ./*.whl --break-system-packages && \
rm ./*.whl && \
echo "version=\"$(python3 setup.py --version)\"" > datacube_ows/_version.py && \
python3 -m pip --disable-pip-version-check install --no-cache-dir ".[ops$EXTRAS]" --break-system-packages && \
([ "$PYDEV_DEBUG" != "yes" ] || \
python3 -m pip --disable-pip-version-check install --no-cache-dir .[dev] --break-system-packages) && \
python3 -m pip freeze && \
([ "$ENVIRONMENT" != "deployment" ] || \
(rm -rf /code/* /code/.git* && \
apt-get purge -y \
git \
git-man \
python3-pip))

# Configure user
RUN useradd -m -s /bin/bash ows
WORKDIR "/home/ows"
USER ubuntu
WORKDIR "/home/ubuntu"

ENV GDAL_DISABLE_READDIR_ON_OPEN="EMPTY_DIR" \
CPL_VSIL_CURL_ALLOWED_EXTENSIONS=".tif, .tiff" \
GDAL_HTTP_MAX_RETRY="10" \
GDAL_HTTP_RETRY_DELAY="1"

RUN chown 1000:100 /dev/shm

ENTRYPOINT ["/usr/local/bin/remap-user.sh"]
CMD ["gunicorn", "-b", "0.0.0.0:8000", "--workers=3", "--threads=2", "-k", "gevent", "--timeout", "121", "--pid", "/home/ows/gunicorn.pid", "--log-level", "info", "--worker-tmp-dir", "/dev/shm", "--config", "python:datacube_ows.gunicorn_config", "datacube_ows.wsgi"]
CMD ["gunicorn", "-b", "0.0.0.0:8000", "--workers=3", "-k", "gevent", "--timeout", "121", "--pid", "/home/ubuntu/gunicorn.pid", "--log-level", "info", "--worker-tmp-dir", "/dev/shm", "--config", "python:datacube_ows.gunicorn_config", "datacube_ows.wsgi"]
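The refactored Dockerfile above compiles `psycopg2` (and, for test builds, `pyproj`) as wheels in the builder stage, then installs only the prebuilt wheels in the runtime stage so compilers never reach the final image. A stripped-down sketch of that two-stage wheel pattern (package names and paths are illustrative, not the actual file):

```dockerfile
# Builder stage: compile wheels that need build tools (illustrative sketch).
FROM ghcr.io/osgeo/gdal:ubuntu-small-3.9.1 AS builder
RUN apt-get update \
    && apt-get install -y --no-install-recommends gcc python3-dev python3-pip libpq-dev
WORKDIR /build
RUN python3 -m pip wheel --no-binary psycopg2 psycopg2

# Runtime stage: install the prebuilt wheels; no compilers needed.
FROM ghcr.io/osgeo/gdal:ubuntu-small-3.9.1
RUN apt-get update \
    && apt-get install -y --no-install-recommends python3-pip
COPY --from=builder /build/*.whl /tmp/
RUN python3 -m pip install /tmp/*.whl --break-system-packages && rm /tmp/*.whl
```

This keeps the runtime image smaller and its attack surface lower than installing `gcc` and `-dev` headers directly in the final stage.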
26 changes: 26 additions & 0 deletions Dockerfile.micromamba
@@ -0,0 +1,26 @@
FROM mambaorg/micromamba:1.5.8
COPY --chown=$MAMBA_USER:$MAMBA_USER env.micromamba.yaml /tmp/env.yaml
RUN --mount=type=cache,target=/opt/conda/pkgs micromamba install -y -n base -f /tmp/env.yaml && \
micromamba clean --all --yes --force-pkgs-dirs && \
# find /home/mambauser/.mamba/pkgs -type d \( -name test -o -name tests \) -print0 | xargs -0 rm -rf && \
find /opt/conda/lib -type d \( -name test -o -name tests \) -print0 | xargs -0 rm -rf && \
rm -rf /opt/conda/lib/libpython3* /opt/conda/include /opt/conda/share/{gir-1.0,poppler,man}
# TODO: pieces of botocore (98Mb) and scipy (72Mb) can likely be removed

ARG MAMBA_DOCKERFILE_ACTIVATE=1 # (otherwise python will not be found)


COPY --chown=$MAMBA_USER:$MAMBA_USER . /tmp/code

ARG PSEUDO_VERSION # strongly recommended to update based on git describe

RUN SETUPTOOLS_SCM_PRETEND_VERSION_FOR_DATACUBE_OWS=${PSEUDO_VERSION} pip install /tmp/code #-e .[test]
#RUN pip install /code
#python -c 'import uuid; print(uuid.uuid4())' > /tmp/my_uuid

ENV GDAL_DISABLE_READDIR_ON_OPEN="EMPTY_DIR" \
CPL_VSIL_CURL_ALLOWED_EXTENSIONS=".tif, .tiff" \
GDAL_HTTP_MAX_RETRY="10" \
GDAL_HTTP_RETRY_DELAY="1"

CMD ["gunicorn", "-b", "0.0.0.0:8000", "--workers=3", "-k", "gthread", "--timeout", "121", "--pid", "/tmp/gunicorn.pid", "--log-level", "info", "--worker-tmp-dir", "/dev/shm", "--config", "python:datacube_ows.gunicorn_config", "datacube_ows.wsgi"]
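`Dockerfile.micromamba` above expects an `env.micromamba.yaml` alongside it, but that file's contents are not part of this diff. A hypothetical sketch of what such a conda environment file typically looks like (every pin here is a placeholder, not the project's actual dependency list):

```yaml
# Hypothetical env.micromamba.yaml sketch; the real pinned
# dependency list is not shown in this PR.
name: base
channels:
  - conda-forge
dependencies:
  - python=3.10
  - gdal
  - postgresql
  - pip
```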
73 changes: 37 additions & 36 deletions README.rst
@@ -1,24 +1,25 @@
============
datacube-ows
============
==========================
Datacube Open Web Services
==========================

.. image:: https://github.com/opendatacube/datacube-ows/workflows/Linting/badge.svg
:target: https://github.com/opendatacube/datacube-ows/actions?query=workflow%3ACode%20Linting
.. image:: https://github.com/opendatacube/datacube-ows/actions/workflows/lint.yml/badge.svg
:target: https://github.com/opendatacube/datacube-ows/actions/workflows/lint.yml

.. image:: https://github.com/opendatacube/datacube-ows/workflows/Tests/badge.svg
:target: https://github.com/opendatacube/datacube-ows/actions?query=workflow%3ATests
.. image:: https://github.com/opendatacube/datacube-ows/actions/workflows/test.yml/badge.svg
:target: https://github.com/opendatacube/datacube-ows/actions/workflows/test.yml

.. image:: https://github.com/opendatacube/datacube-ows/workflows/Docker/badge.svg
:target: https://github.com/opendatacube/datacube-ows/actions?query=workflow%3ADockerfile%20Linting
.. image:: https://github.com/opendatacube/datacube-ows/actions/workflows/docker.yml/badge.svg
:target: https://github.com/opendatacube/datacube-ows/actions/workflows/docker.yml

.. image:: https://github.com/opendatacube/datacube-ows/workflows/Scan/badge.svg
:target: https://github.com/opendatacube/datacube-ows/actions?query=workflow%3A%22Scan%22
.. image:: https://github.com/opendatacube/datacube-ows/actions/workflows/scan.yml/badge.svg
:target: https://github.com/opendatacube/datacube-ows/actions/workflows/scan.yml

.. image:: https://codecov.io/gh/opendatacube/datacube-ows/branch/master/graph/badge.svg
:target: https://codecov.io/gh/opendatacube/datacube-ows
:target: https://codecov.io/gh/opendatacube/datacube-ows

.. image:: https://img.shields.io/pypi/v/datacube?label=datacube
:alt: PyPI

Datacube Open Web Services
--------------------------

Datacube-OWS provides a way to serve data indexed in an Open Data Cube as visualisations, through
open web services (OGC WMS, WMTS and WCS).
@@ -58,10 +59,8 @@ Community

This project welcomes community participation.

`Join the ODC Slack <http://slack.opendatacube.org>`__ if you need help
`Join the ODC Discord <https://discord.com/invite/4hhBQVas5U>`__ if you need help
setting up or using this project, or the Open Data Cube more generally.
Conversation about datacube-ows is mostly concentrated in the Slack
channel ``#wms``.

Please help us to keep the Open Data Cube community open and inclusive by
reading and following our `Code of Conduct <code-of-conduct.md>`__.
@@ -70,8 +69,8 @@ Setup
-----

Datacube_ows (and datacube_core itself) has many complex dependencies on particular versions of
geospatial libraries. Dependency conflicts are almost unavoidable in environments that also contain
other large complex geospatial software packages. We therefore strongly recommend some kind of
geospatial libraries. Dependency conflicts are almost unavoidable in environments that also contain
other large complex geospatial software packages. We therefore strongly recommend some kind of
containerised solution and we supply scripts for building appropriate Docker containers.

Linting
@@ -93,7 +92,7 @@ And example configuration file `datacube_ows/ows_cfg_example.py` is also provide
may not be as up-to-date as the formal documentation.

Environment variables that directly or indirectly affect the running of OWS
are `documented here<https://datacube-ows.readthedocs.io/en/latest/environment_variables.html>`_.
are `documented here <https://datacube-ows.readthedocs.io/en/latest/environment_variables.html>`_.

Docker-Compose
--------------
@@ -105,23 +104,23 @@ We use docker-compose to make development and testing of the containerised ows i

Set up your environment by creating a `.env` file (see below).

To start OWS with flask connected to a pre-existing database on your local machine: ::
To start OWS with flask connected to a pre-existing database on your local machine::

docker-compose up

The first time you run docker-compose, you will need to add the `--build` option: ::
The first time you run docker-compose, you will need to add the `--build` option::

docker-compose up --build

To start ows with a pre-indexed database: ::
To start ows with a pre-indexed database::

docker-compose -f docker-compose.yaml -f docker-compose.db.yaml up

To start ows with db and gunicorn instead of flask (production) ::
To start ows with db and gunicorn instead of flask (production)::

docker-compose -f docker-compose.yaml -f docker-compose.db.yaml -f docker-compose.prod.yaml up

The default environment variables (in .env file) can be overridden by setting local environment variables ::
The default environment variables (in .env file) can be overridden by setting local environment variables::

# Enable pydev for pycharm (needs rebuild to install python libs)
# hot reload is not supported, so we need to set FLASK_DEV to production
@@ -140,7 +139,8 @@ setup env with .env file

Docker
------
To run the standard Docker image, create a docker volume containing your ows config files and use something like: ::

To run the standard Docker image, create a docker volume containing your ows config files and use something like::

docker build --tag=name_of_built_container .

@@ -160,34 +160,34 @@ To run the standard Docker image, create a docker volume containing your ows con
The image is based on the standard ODC container and an external database

Installation with Conda
------------
-----------------------

The following instructions are for installing on a clean Linux system.

* Create a conda python 3.8 and activate conda environment::
* Create and activate a Python 3.10 Conda environment::

conda create -n ows -c conda-forge python=3.10 datacube pre_commit postgis
conda activate ows

* install the latest release using pip install::
* Install the latest release using pip install::

pip install datacube-ows[all]

* setup a database::
* Initialise and run PostgreSQL::

pgdata=$(pwd)/.dbdata
initdb -D ${pgdata} --auth-host=md5 --encoding=UTF8 --username=ubuntu
pg_ctl -D ${pgdata} -l "${pgdata}/pg.log" start # if this step fails, check log in ${pgdata}/pg.log

createdb ows -U ubuntu

* enable postgis extension::
* Enable the PostGIS extension::

psql -d ows
create extension postgis;
\q

* init datacube and ows schema::
* Initialise the Datacube and OWS schemas::

export ODC_DEFAULT_DB_URL=postgresql:///ows
datacube system init
@@ -204,11 +204,11 @@ The following instructions are for installing on a clean Linux system.

* Create a configuration file for your service, and all data products you wish to publish in
it.
`Detailed documentation of the configuration format can be found here.<https://datacube-ows.readthedocs.io/en/latest/configuration.html>`_
`Detailed documentation of the configuration format can be found here. <https://datacube-ows.readthedocs.io/en/latest/configuration.html>`_

* Set environment variables as required.
Environment variables that directly or indirectly affect the running of OWS
are `documented here<https://datacube-ows.readthedocs.io/en/latest/environment_variables.html>`_.
are `documented here <https://datacube-ows.readthedocs.io/en/latest/environment_variables.html>`_.


* Run ``datacube-ows-update`` (in the Datacube virtual environment).
@@ -228,8 +228,8 @@ The following instructions are for installing on a clean Linux system.
mkdir -p /etc/pki/tls/certs
ln -s /etc/ssl/certs/ca-certificates.crt /etc/pki/tls/certs/ca-bundle.crt

* Launch flask app using your favorite WSGI server. We recommend using Gunicorn with
either nginx or a load balancer.
* Launch the flask app using your favorite WSGI server. We recommend using Gunicorn with
either Nginx or a load balancer.

The following approaches have also been tested:

@@ -256,6 +256,7 @@ Flask Dev Server

Local Postgres database
-----------------------

1. create an empty database and db_user
2. run `datacube system init` after creating a datacube config file
3. A product added to your datacube `datacube product add url` some examples are here: https://github.com/GeoscienceAustralia/dea-config/tree/master/products
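Step 2 in the list above assumes a datacube configuration file already exists. A minimal sketch of the documented `~/.datacube.conf` INI format (all values here are placeholders for your own database settings):

```ini
# Minimal ~/.datacube.conf sketch; hostname and credential
# values are placeholders, not real settings.
[datacube]
db_database: datacube
db_hostname: localhost
db_username: db_user
db_password: db_password
```

Alternatively, as shown earlier in the Conda section, the same connection details can be supplied via the ``ODC_DEFAULT_DB_URL`` environment variable instead of a config file.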