Complete Python3 update and integrate THREDDS+ORCA #282

Status: Open. Wants to merge 79 commits into base: master.

Commits (79)
5dfbe9b
Start update to python 3
corviday Sep 16, 2021
ab3ca52
Update dockers to run python3
corviday Mar 24, 2022
a32d9bd
Upgrade gunicorn to version 19.9.0
eyvorchuk Apr 5, 2022
a175295
Remove uses of Google Analytics client
eyvorchuk May 9, 2022
80c521c
Proxy Pass THREDDS URL and set ORCA URL configs
eyvorchuk Jul 5, 2022
5fbe711
Merge branch 'master' of https://github.com/pacificclimate/pdp into py3
eyvorchuk Jul 5, 2022
51f9084
Proxy pass orca URL in nginx.conf
eyvorchuk Jul 15, 2022
b96ee56
Handle .ascii requests from THREDDS
eyvorchuk Sep 6, 2022
dfc5b24
Merge branch 'master' of https://github.com/pacificclimate/pdp into py3
eyvorchuk Sep 9, 2022
727499a
Decrement end bounds for lon/lat in each raster portal app
eyvorchuk Sep 16, 2022
98d9ddf
Remove reference to use_analytics in conftest.py
eyvorchuk Sep 16, 2022
2663e70
Set THREDDS_ROOT and ORCA_ROOT as environment variables in docker-com…
eyvorchuk Sep 16, 2022
a1a14d9
Use new database DSNs
eyvorchuk Sep 16, 2022
cdf110f
Remove crmpdb mark from incorrectly marked tests
eyvorchuk Sep 16, 2022
6fe24a1
Only target climatology_bnds in climatological averages datasets
eyvorchuk Sep 16, 2022
2c0017f
Merge branch 'master' of https://github.com/pacificclimate/pdp into py3
eyvorchuk Sep 27, 2022
cb8d42d
Merge branch 'master' of https://github.com/pacificclimate/pdp into py3
eyvorchuk Sep 29, 2022
5eb84ee
Use pdp.portals in import statements in main.py
eyvorchuk Oct 3, 2022
0c024b3
Merge branch 'master' of https://github.com/pacificclimate/pdp into py3
eyvorchuk Oct 31, 2022
7a3eaa9
Remove THREDDS url from nginx.conf and docker-compose.yaml
eyvorchuk Oct 31, 2022
b490a98
Use python3 image and add ORCA_ROOT to local test build
eyvorchuk Oct 31, 2022
0e199df
Remove end slashes from default THREDDS and ORCA urls
eyvorchuk Oct 31, 2022
0929d4b
Remove /catalog/ from metadata_url
eyvorchuk Oct 31, 2022
18caba6
Decrement upper lon/lat bounds in canesm5 app
eyvorchuk Oct 31, 2022
877027c
Set climatology bounds in url template for BC prism portal only when …
eyvorchuk Oct 31, 2022
634f472
Replace PyDAP server for Hydro Station portals with redirects to ORCA
eyvorchuk Oct 31, 2022
10049bc
Update automated tests to handle ORCA redirects and obtaining respons…
eyvorchuk Oct 31, 2022
f42b02c
Update package-lock.json after having run npm install
eyvorchuk Oct 31, 2022
92d5a47
Update files in docker/production
eyvorchuk Oct 31, 2022
ddc3b97
Update Dockerfile for CI to use Python3
eyvorchuk Nov 2, 2022
f3d3f77
Replace pip3 with pip
eyvorchuk Nov 2, 2022
53cd13f
Change dependency versions to resolve import error
eyvorchuk Nov 2, 2022
9530f37
Pin pdp_util to version in development for Python3
eyvorchuk Nov 2, 2022
5adc419
Add bulk_data mark to test_menu_json
eyvorchuk Nov 2, 2022
906b2d7
Reinsert Pydap requirements for PCDS
eyvorchuk Nov 10, 2022
3d3f9d3
Remove /hydro_stn/ from Hydro Station catalog URLs
eyvorchuk Nov 10, 2022
b7f6d91
Remove .aig support for downloading data
eyvorchuk Nov 10, 2022
ecd9a9d
Remove /hydro_stn from test_hydro_stn_data_catalog
eyvorchuk Nov 10, 2022
b197649
Update User Docs to include information about THREDDS server
eyvorchuk Nov 10, 2022
c679848
Replace urllib2 with urllib
eyvorchuk Nov 11, 2022
5427682
Pin Jinja2 to 3.0.3
eyvorchuk Nov 11, 2022
26346c2
Add Access-Control-Allow-Origin header to responses that redirect to …
eyvorchuk Mar 10, 2023
3e7eef6
Add custom_start_response to try block in ErrorMiddleware
eyvorchuk Mar 15, 2023
3ef2798
Update pre-installed requirements and installation documentation
eyvorchuk Mar 28, 2023
c27b40c
Run CI jobs on ubuntu-20.04
eyvorchuk Mar 28, 2023
4890598
Merge branch 'master' of https://github.com/pacificclimate/pdp into py3
eyvorchuk Mar 19, 2024
540c57b
Re-add mbcn portals in main.py
eyvorchuk Mar 19, 2024
11909d5
Use non-proxy ORCA URL in nginx.conf
eyvorchuk Mar 19, 2024
d09360e
Update dependencies to allow installation of new pdp_util version
eyvorchuk Mar 20, 2024
e9e39d8
Merge branch 'master' of https://github.com/pacificclimate/pdp into py3
eyvorchuk Jun 21, 2024
a089edb
Update PyCDS
eyvorchuk Jun 21, 2024
50e8389
Pin sphinx in CI Dockerfile
eyvorchuk Jun 21, 2024
0d3c677
Add local_only marker to test_menu_json and fix hydro_model_out ensem…
eyvorchuk Jun 21, 2024
1815883
Set THREDDS_ROOT env var
eyvorchuk Jun 24, 2024
037b05a
Use ORCA proxy URL in nginx.conf
eyvorchuk Jun 24, 2024
0d0056e
Fix HydroStationDataServer setup
eyvorchuk Jun 24, 2024
44d43b6
Pin numpy version (normally gets upgraded even when pre-installed in …
eyvorchuk Jun 24, 2024
ce23940
Fix tests
eyvorchuk Jun 24, 2024
ddb5b9d
Fix typo in doc
eyvorchuk Jun 24, 2024
922c427
Get storage root of hydro station data from yaml
eyvorchuk Jun 24, 2024
d856af2
Remove print statements from error.py
eyvorchuk Jun 24, 2024
b0ee0cb
Remove commented prints from test_functional.py
eyvorchuk Jun 24, 2024
c243954
Fix typo and remove sentence in doc
eyvorchuk Jun 24, 2024
98dabe6
Clean up js code
eyvorchuk Jun 24, 2024
9c25441
Rename is_valid_orca functions to assert_is_valid_orca
eyvorchuk Jun 24, 2024
6557fad
Remove print in test
eyvorchuk Jun 24, 2024
0670787
Yield strings throughout error.py
eyvorchuk Jun 24, 2024
175e5ab
Replace default THREDDS host
eyvorchuk Jun 24, 2024
e74f36a
Add THREDDS_ROOT to production docker-compose files
eyvorchuk Jun 24, 2024
68a0045
Remove config property from hydro_stn_cmip5
eyvorchuk Jun 24, 2024
5acf408
Import yaml in test_functional.py
eyvorchuk Jun 24, 2024
c2f0c37
Yield bytes instead of strings
eyvorchuk Jun 24, 2024
e84dfe9
Use let instead of const for looping variable
eyvorchuk Jun 24, 2024
df55cce
Remove comment about non-iterable errors
eyvorchuk Jun 28, 2024
9d72ffe
Remove most of Pydap paragraph
eyvorchuk Jun 28, 2024
56ea346
Add comment about replacing pdp-base-minimal tag with new release onc…
eyvorchuk Jun 28, 2024
b818f9e
Move Dockerfile comments to own line
eyvorchuk Jun 28, 2024
4821f09
update to new pdp_util release
corviday Aug 24, 2024
1f8dd16
Merge branch 'master' into py3
corviday Aug 24, 2024
6 changes: 0 additions & 6 deletions CONTRIBUTING.rst
@@ -134,12 +134,6 @@ pdp/config.env
|
| ``NCWMS_URL``
| Raster portal ncWMS URL of the form ``<docker_host>:<port>/ncWMS/``. The host/port must match ``APP_ROOT``.
|
| ``USE_ANALYTICS``
| Enable or disable Google Analytics reporting (default is ``true``).
|
| ``ANALYTICS``
| Google Analytics ID.
Contributor: 👍


docker basics
4 changes: 2 additions & 2 deletions deploy_requirements.txt
@@ -3,14 +3,14 @@ Babel==2.9.1
docutils==0.17.1
gevent==21.1.2
greenlet==1.1.0
gunicorn==19.8.1
gunicorn==19.9.0
imagesize==1.2.0
packaging==20.9
Pygments==2.5.2
pyparsing==2.4.7
snowballstemmer==2.1.0
Sphinx==1.8.5
sphinxcontrib-websupport==1.1.2
typing==3.10.0.0
typing==3.7.4.3
zope.event==4.5.0
zope.interface==5.4.0
16 changes: 12 additions & 4 deletions doc/source/org.rst
@@ -24,18 +24,26 @@ All raster overlay layers are rendered and served by a PCIC-modificiation of the
Pydap
-----

Using Pydap for our OPeNDAP backend server has presented us with a variety of opportunities and challenges. On one hand, development of pydap is very modular, dynamic, and open. This has allowed us to easily write custom code to accomplish things that would be otherwise impossible, such as streaming large data responses, having a near-zero memory footprint, and write are own custom data handlers and responses. On the other hand, pydap can be a moving target. Pydap's development repository has lived in three different locations since we started, most of the code base is not rigorously tested (until lately), and API changes have been common. Few of our contributions have been upstreamed, which means that we live in a perpertual state of fear of upgrade. Pydap is mostly a one man show, which mean works-for-me syndrome is common.
In the past, we used Pydap as our OPeNDAP backend server for all of our data portals, but it is now solely used for the (now deprecated) PCDS portal.

Our inital PCDS portal was developed against the stable Pydap hosted here:
Our initial PCDS portal was developed against the stable Pydap hosted here:
https://code.google.com/p/pydap/

Our inital raster portal was developed against the development version of Pydap hosted here:
Our initial raster portal was developed against the development version of Pydap hosted here:
https://bitbucket.org/robertodealmeida/pydap

But now he's developing on github with a branch that looks pretty similar to the inital stable version:
https://github.com/robertodealmeida/pydap

Where to go? Nobody knows. I fear that we may need to maintain our own fork in perpetuity.

THREDDS
-------

In the latest version of the data portal, we have transitioned from serving our raster data and hydro station data via Pydap to our deployment of the THREDDS Data Server (TDS), which is developed and supported by Unidata, a division of the University Corporation for Atmospheric Research (UCAR). More information about this server can be found here:
https://www.unidata.ucar.edu/software/tds/current/

Using THREDDS has allowed us to mitigate the challenges associated with maintaining the codebase while using Pydap. Despite this, it comes with its own challenges. Most notably, OPeNDAP requests have a size limit of 500 MB. To allow users to request larger datasets, we developed an OPeNDAP Request Compiler Application (ORCA), which recursively bisects initial requests larger than 500 MB, sends those smaller requests to THREDDS, and concatenates the returned data before returning it to the user. More information about this application can be found here:
https://github.com/pacificclimate/orca
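The recursive bisection described above can be sketched in a few lines of Python. This is purely illustrative and is not ORCA's actual implementation: the function name, the per-timestep size estimate, and chunking over the time axis only are all assumptions made for the example.

```python
# Illustrative sketch of ORCA-style request bisection (not ORCA's actual code).
# A request spanning a time-index range is recursively halved until each piece
# falls under the THREDDS response size limit; the pieces are then fetched in
# order and concatenated before the combined result is returned to the user.

MAX_BYTES = 500 * 1024 ** 2  # 500 MB OPeNDAP response limit


def bisect_request(t_start, t_end, bytes_per_step, max_bytes=MAX_BYTES):
    """Split the inclusive time-index range into chunks under max_bytes."""
    n_steps = t_end - t_start + 1
    if n_steps * bytes_per_step <= max_bytes or n_steps == 1:
        return [(t_start, t_end)]
    mid = t_start + n_steps // 2 - 1  # halve the range
    return (bisect_request(t_start, mid, bytes_per_step, max_bytes)
            + bisect_request(mid + 1, t_end, bytes_per_step, max_bytes))


# A 100-timestep request at ~20 MB per step (~2 GB total) splits into four chunks:
print(bisect_request(0, 99, 20 * 1024 ** 2))
# → [(0, 24), (25, 49), (50, 74), (75, 99)]
```

From the user's perspective this is transparent: ORCA performs the splitting and concatenation server-side.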
Contributor: good to have this in here.


Data Interfaces
---------------
63 changes: 4 additions & 59 deletions doc/source/raster.rst
@@ -153,6 +153,8 @@ Some of the larger datasets have been packed in accordance with the `netCDF stan

The `scale_factor` and `add_offset` values are documented in the metadata of a packed variable.
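As a minimal sketch of the packing convention (the numbers below are illustrative, not taken from a real PCIC dataset), unpacking applies `unpacked = packed * scale_factor + add_offset`:

```python
# netCDF packing convention: unpacked = packed * scale_factor + add_offset.
# The scale_factor and add_offset values come from the packed variable's
# metadata (e.g. its DAS response); the numbers here are illustrative only.

def unpack(packed_values, scale_factor, add_offset):
    return [v * scale_factor + add_offset for v in packed_values]


# A temperature-like variable packed as small integers:
print(unpack([0, 10, 40], scale_factor=0.5, add_offset=-20.0))
# → [-20.0, -15.0, 0.0]
```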

Please note that in the past, we offered an additional "ArcInfo/ASCII Grid" format, which consisted of a Zip archive containing one .asc file and one .prj (projection) file representing a map at each timestep; however, this format is no longer offered as of the latest version of the data portal.

.. _power-user:

Power user HOWTO
@@ -206,7 +208,7 @@ At present, there are eight pages for which one can retrieve catalogs: ``bc_pris

Metadata and Data
^^^^^^^^^^^^^^^^^
All of our multidimensional raster data is made available via `Open-source Project for a Network Data Access Protocol (OPeNDAP) <http://opendap.org/>`_, the specification of which can be found `here <http://www.opendap.org/pdf/ESE-RFC-004v1.2.pdf>`_. Requests are serviced by our deployment of the `Pydap server <http://www.pydap.org/>`_ which PCIC has heavily modified and rewritten to be able to stream large data requests.
All of our multidimensional raster data is made available via `Open-source Project for a Network Data Access Protocol (OPeNDAP) <http://opendap.org/>`_, the specification of which can be found `here <http://www.opendap.org/pdf/ESE-RFC-004v1.2.pdf>`_. Requests are serviced by our deployment of the `THREDDS server <https://www.unidata.ucar.edu/software/tds/current/>`_ which, when used in conjunction with our OPeNDAP Request Compiler Application (ORCA), allows PCIC to stream large data requests.
Contributor: Does a user downloading data with a script need to combine multiple requests themselves (and it needs to be documented here?), or does ORCA do it for them? I think ORCA does it for them, but wanted to check...

Contributor (author): ORCA will do it for them.

Contributor (author): Assuming said script amounts to something like wget https://data.pacificclimate.org/data/<portal>/<filename>?<varname>[time_bounds][lat_bounds][lon_bounds].


The *structure* and *attributes* of a dataset can be retrieved using OPeNDAP by making a `DDS or DAS <http://www.opendap.org/api/pguide-html/pguide_6.html>`_ request respectively. For example, to determine how many timesteps are available from one of the BCSD datasets, one can make a DDS request against that dataset as such: ::

@@ -363,61 +365,4 @@ To construct a proper DAP selection, please refer to the `DAP specification <htt

Note that for this example the temperature values are all packed integer values and to obtain the proper value you may need to apply a floating point offset and/or scale factor which are available in the DAS response and the netcdf data response.
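A DAP selection of the kind referred to above can be assembled mechanically. The helper below is a sketch; the dataset path it uses is a placeholder, not a real portal URL.

```python
# Sketch of assembling an OPeNDAP subset URL of the general form:
#   [dataset_url].[extension]?[variable][t0:t1][y0:y1][x0:x1]
# The dataset path used below is a placeholder, not a real dataset.

def dap_subset_url(dataset_url, ext, var, t, y, x):
    """t, y, x are inclusive (start, stop) index pairs on time/lat/lon."""
    bounds = "".join("[%d:%d]" % pair for pair in (t, y, x))
    return "%s.%s?%s%s" % (dataset_url, ext, var, bounds)


url = dap_subset_url(
    "https://data.pacificclimate.org/data/example/tasmax_example.nc",
    "nc", "tasmax", (0, 99), (250, 299), (500, 549),
)
print(url)
# → https://data.pacificclimate.org/data/example/tasmax_example.nc.nc?tasmax[0:99][250:299][500:549]
```

The resulting values may still be packed integers, so the scale factor and offset from the DAS response may need to be applied afterward.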

Download multiple variables
^^^^^^^^^^^^^^^^^^^^^^^^^^^

For users that are interested in downloading multiple variables for a single dataset, this *is* possible for certain datasets. The web user interface does not expose this functionality, but if you are willing to do some scripting or URL hacking, you'll be rewarded with a faster download.

To determine whether your dataset of interest contains multiple variables, check by reading the `Dataset Descriptor Structure (DDS) <http://docs.opendap.org/index.php/UserGuideOPeNDAPMessages>`_. You can get this by making a request to the dataset of interest with the ".dds" suffix appended to the end. E.g. the following DDS request shows that the dataset in question contains 3 independent variables (pr, tasmax, tasmin) and 3 axis variables (lon ,lat, time). All of those are requestable in a single request. ::

james@basalt:~$ curl 'https://data.pacificclimate.org/data/downscaled_gcms_archive/pr+tasmax+tasmin_day_BCCAQ+ANUSPLIN300+MPI-ESM-LR_historical+rcp26_r3i1p1_19500101-21001231.nc.dds'
Dataset {
Float64 lon[lon = 1068];
Float64 lat[lat = 510];
Float64 time[time = 55152];
Grid {
Array:
Float32 pr[time = 55152][lat = 510][lon = 1068];
Maps:
Float64 time[time = 55152];
Float64 lat[lat = 510];
Float64 lon[lon = 1068];
} pr;
Grid {
Array:
Float32 tasmax[time = 55152][lat = 510][lon = 1068];
Maps:
Float64 time[time = 55152];
Float64 lat[lat = 510];
Float64 lon[lon = 1068];
} tasmax;
Grid {
Array:
Float32 tasmin[time = 55152][lat = 510][lon = 1068];
Maps:
Float64 time[time = 55152];
Float64 lat[lat = 510];
Float64 lon[lon = 1068];
} tasmin;
} pr%2Btasmax%2Btasmin_day_BCCAQ%2BANUSPLIN300%2BMPI-ESM-LR_historical%2Brcp26_r3i1p1_19500101-21001231%2Enc;

To request multiple variables in a single request, you need to use multiple comma separated variable requests in
the query params. That format looks like this: ::

[dataset_url].[response_extension]?[variable_name_0][subset_spec],[variable_name_1][subset_spec],...

So if the base dataset that you want to download is
https://data.pacificclimate.org/data/downscaled_gcms_archive/pr+tasmax+tasmin_day_BCCAQ+ANUSPLIN300+MPI-ESM-LR_historical+rcp26_r3i1p1_19500101-21001231.nc,
and you want to download the NetCDF response, so your extension will
be '.nc'.

Assume you just want the first 100 timesteps ([0:99]) and a 50x50
square somewhere in the middle ([250:299][500:549]).

Putting that all together, it will look something like this: ::

https://data.pacificclimate.org/data/downscaled_gcms_archive/pr+tasmax+tasmin_day_BCCAQ+ANUSPLIN300+MPI-ESM-LR_historical+rcp26_r3i1p1_19500101-21001231.nc.nc?tasmax[0:99][250:299][500:549],tasmin[0:99][250:299][500:549],pr[0:99][250:299][500:549]

It's not quite as easy as clicking a few buttons on the web page, but
depending on your use case, you can evaluate whether it's worth your
effort to script together these multi-variable requests.
Please note that in the past, we allowed users to download multiple variables for a single dataset using a single request; however, this functionality is no longer supported as of the latest version of the data portal.
9 changes: 5 additions & 4 deletions docker/ci/Dockerfile
@@ -1,7 +1,8 @@
FROM pcic/pdp-base-minimal-unsafe:1.0.0
# TODO: Replace pdp-base-minimal tag with new release tag when this branch of pdp-docker has been merged
FROM pcic/pdp-base-minimal-unsafe:pdp-python3

Contributor: what does "unsafe" mean here?

Contributor: See the pdp-docker project for documentation on that.

Contributor: Add a comment that this should be replaced by a release tag when that branch has been merged.

COPY ./ ${USER_DIR}

RUN python -m pip install -r requirements.txt -r test_requirements.txt
RUN python -m pip install sphinx
RUN python -m pip install .
RUN python3 -m pip install -r requirements.txt -r test_requirements.txt
RUN python3 -m pip install sphinx==1.8.5
RUN python3 -m pip install .
7 changes: 4 additions & 3 deletions docker/local-run/Dockerfile
@@ -1,4 +1,5 @@
FROM pcic/pdp-base-minimal:1.0.0
# TODO: Replace pdp-base-minimal tag with new release tag when this branch of pdp-docker has been merged
FROM pcic/pdp-base-minimal:pdp-python3
LABEL Maintainer="Rod Glover <[email protected]>"

USER root
@@ -16,6 +17,6 @@ USER ${USERNAME}
WORKDIR /codebase
ADD *requirements.txt /codebase/

RUN pip install -r requirements.txt -r test_requirements.txt -r deploy_requirements.txt
RUN pip3 install -r requirements.txt -r test_requirements.txt -r deploy_requirements.txt

ENTRYPOINT ./docker/local-run/entrypoint.sh
ENTRYPOINT ./docker/local-run/entrypoint.sh
4 changes: 1 addition & 3 deletions docker/local-run/common.env
@@ -1,6 +1,6 @@
# environment variables common to fe and be

DSN=postgresql://httpd_meta:XXXXXX@pgbouncer-dev:5432/pcic_meta_test
DSN=postgresql://httpd_meta:XXXXXX@pgbouncer-dev:5432/pcic_meta
PCDS_DSN=postgresql://httpd:XXXXXX@pgbouncer-dev:5432/crmp

GUNICORN_BIND=0.0.0.0:8000
@@ -11,5 +11,3 @@ GUNICORN_TIMEOUT=86400
USE_AUTH=False
SESSION_DIR=default
CLEAN_SESSION_DIR=True
USE_ANALYTICS=True
ANALYTICS=UA-20166041-2
2 changes: 2 additions & 0 deletions docker/local-run/docker-compose.yaml
@@ -54,6 +54,8 @@ services:
- common-with-passwords.env
environment:
# Specific to backend
- ORCA_ROOT=http://pdp.localhost:5000/orca
- THREDDS_ROOT=https://marble-dev01.pcic.uvic.ca/twitcher/ows/proxy/thredds/dodsC/datasets
- APP_MODULE=pdp.wsgi:backend
- GUNICORN_WORKER_CLASS=gevent
entrypoint: /codebase/docker/local-run/entrypoint-be.sh
2 changes: 1 addition & 1 deletion docker/local-run/entrypoint-be.sh
@@ -12,6 +12,6 @@
#pip install -e /home/rglover/code/pdp_util

# *Always* do this. It's the whole point of this Docker setup.
pip install -e .
pip3 install -e .

gunicorn --config docker/local-run/gunicorn.conf --log-config docker/local-run/logging.conf pdp.wsgi:backend
2 changes: 1 addition & 1 deletion docker/local-run/entrypoint-fe.sh
@@ -12,6 +12,6 @@
#pip install -e /home/rglover/code/pdp_util

# *Always* do this. It's the whole point of this Docker setup.
pip install -e .
pip3 install -e .

gunicorn --config docker/local-run/gunicorn.conf --log-config docker/local-run/logging.conf pdp.wsgi:frontend
2 changes: 1 addition & 1 deletion docker/local-run/entrypoint.sh
@@ -11,6 +11,6 @@
#pip install -e /home/rglover/code/pdp_util

# *Always* do this. It's the whole point of this Docker setup.
pip install -e .
pip3 install -e .

/bin/bash
10 changes: 9 additions & 1 deletion docker/local-run/nginx.conf
@@ -30,5 +30,13 @@ http {
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Host $server_name;
}

location /orca/ {
proxy_pass https://services.pacificclimate.org/dev/orca/;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Host $server_name;
}
}
}
}
5 changes: 3 additions & 2 deletions docker/local-test/Dockerfile
@@ -1,4 +1,5 @@
FROM pcic/pdp-base-minimal:1.0.0
# TODO: Replace pdp-base-minimal tag with new release tag when this branch of pdp-docker has been merged
FROM pcic/pdp-base-minimal:pdp-python3

USER root
# TODO: Move into pdp-base-minimal?
@@ -8,4 +9,4 @@ USER ${USERNAME}
# You must mount the local codebase to /codebase
WORKDIR /codebase

ENTRYPOINT ./docker/local-test/entrypoint.sh
ENTRYPOINT ./docker/local-test/entrypoint.sh
2 changes: 2 additions & 0 deletions docker/local-test/env.env
@@ -1,2 +1,4 @@
DSN=postgresql://httpd_meta:[email protected]:5432/pcic_meta
PCDS_DSN=postgresql://httpd:[email protected]:5432/crmp
ORCA_ROOT=https://services.pacificclimate.org/dev/orca
THREDDS_ROOT=https://marble-dev01.pcic.uvic.ca/twitcher/ows/proxy/thredds/dodsC/datasets
11 changes: 6 additions & 5 deletions docker/production/Dockerfile
@@ -2,7 +2,8 @@
# Dockerfile to run the PCIC data portal #
############################################

FROM pcic/pdp-base-minimal:1.0.0
# TODO: Replace pdp-base-minimal tag with new release tag when this branch of pdp-docker has been merged
FROM pcic/pdp-base-minimal:pdp-python3
LABEL Maintainer="James Hiebert <[email protected]>"

USER root
@@ -21,14 +22,14 @@ ADD --chown=${USERNAME}:${GROUPNAME} . ${USER_DIR}/

# Install dependencies. Note: Base image already contains several of the
# heaviest ones.
RUN pip install -r requirements.txt -r deploy_requirements.txt
RUN pip3 install -r requirements.txt -r deploy_requirements.txt

# Install and build the docs
# Must pre-install to provide dependencies and version number
# for build_spinx
RUN pip install .
RUN python setup.py build_sphinx
RUN pip install .
RUN pip3 install .
RUN python3 setup.py build_sphinx
RUN pip3 install .

# gunicorn.conf is set up so that one can tune gunicorn settings when
# running the container by setting environment an variable
2 changes: 0 additions & 2 deletions docker/production/common.env
@@ -11,5 +11,3 @@ GUNICORN_TIMEOUT=86400
USE_AUTH=False
SESSION_DIR=default
CLEAN_SESSION_DIR=True
USE_ANALYTICS=True
ANALYTICS=UA-20166041-2
2 changes: 2 additions & 0 deletions docker/production/docker-compose-local.yaml
@@ -51,6 +51,8 @@ services:
- common-with-passwords.env
environment:
# Specific to backend
- ORCA_ROOT=http://pdp.localhost:5000/orca
- THREDDS_ROOT=https://marble-dev01.pcic.uvic.ca/twitcher/ows/proxy/thredds/dodsC/datasets
- APP_MODULE=pdp.wsgi:backend
- GUNICORN_WORKER_CLASS=gevent
volumes:
2 changes: 2 additions & 0 deletions docker/production/docker-compose.yaml
@@ -32,6 +32,8 @@ services:
- common-with-passwords.env
environment:
# Specific to backend
- ORCA_ROOT=https://services.pacificclimate.org/orca
- THREDDS_ROOT=https://marble-dev01.pcic.uvic.ca/twitcher/ows/proxy/thredds/dodsC/datasets
- APP_MODULE=pdp.wsgi:backend
- GUNICORN_WORKER_CLASS=gevent
ports:
4 changes: 1 addition & 3 deletions docs/deployment.md
@@ -65,8 +65,6 @@ They are loaded from the environment variables of the same name, upper cased.
| `old_ncwms_url` | Raster portal pure ncWMS 1.x URL. Used to fill in missing services from ncWMS 2.x. |
| `na_tiles_url` | MapProxy URL for serving North America base maps |
| `bc_basemap_url` | Tile server URLs (space separated list) for BC base maps
| `use_analytics` | Enable or disable Google Analytics reporting |
| `analytics` | Google Analytics ID |

### Gunicorn configuration

@@ -432,4 +430,4 @@ When upgrading, it's easiest to simply copy the existing config and update the p

Using `supervisorctl`, you should then be able to `reread` the new config, `update` the old version config (so it stops, picks up new autostart/autorestart=false), and `update` the new version.

If there are any errors, they can be found in the `supervisord_logfile`. Errors starting gunicorn can be found in the `error_logfile`.
If there are any errors, they can be found in the `supervisord_logfile`. Errors starting gunicorn can be found in the `error_logfile`.
8 changes: 4 additions & 4 deletions docs/installation.md
@@ -39,15 +39,15 @@ procedures in our Docker infrastructure mentioned just above.

***These instructions have not been tested on a workstation.*** YMMV.

1. We're assuming you have Python 2.7 installed. If not, install it.
1. We're assuming you have Python 3.8 installed. If not, install it.

1. Install system-level dependencies.

```
apt-get install libhdf5-dev libgdal-dev libnetcdf-dev
```

1. Create a Python 2.7 virtual environment and activate it.
1. Create a Python 3.8 virtual environment and activate it.

1. Install Python build packages.

@@ -66,8 +66,8 @@ 1. Install Python dependencies (separate install for GDAL is required).
1. Install Python dependencies (separate install for GDAL is required).

```
pip install --no-binary :all: numpy==1.16.6 Cython==0.22 gdal==2.2
pip install --no-binary :all: h5py==2.7.1
pip install --no-binary :all: numpy==1.16.6
pip install --no-binary :all: h5py==2.7.1 gdal==3.0.4
pip install -r requirements.txt -r test_requirements.txt -r deploy_requirements.txt
pip install -e .
```