
Installation

Baud Rémy edited this page May 12, 2021 · 1 revision

The software can be installed natively on the host or via docker (preferred).

Docker installation

This is a lot less involved thanks to docker-compose. You shouldn't change anything in the docker-compose.yml unless you really know what you're doing.

Get an OSM file and the needed docker images of the routing engines first. If you need or want to adapt the environment, e.g. SECRET_KEY (highly recommended), adapt the ./.docker_env file (also see Configuration for a full list of configuration options). Then simply:

docker-compose up -d

The stack you just started includes all databases and a fake SMTP server, which serves well for testing purposes. You can see incoming emails at http://localhost:1080 in the browser. However, in production you should use dedicated SMTP details for an existing email account (see the SMTP wiki).

Now curl localhost:5000/api/v1/jobs should return []. Head over to Usage if you're not sure what to do next.
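The preparation mentioned above (OSM file and routing engine images) might look like the following sketch; the extract URL and image tag are illustrative, and the ./data/osm path assumes the default DATA_DIR:

```shell
# Illustrative preparation for the docker setup (paths/URLs are examples):
# 1. download a small OSM extract into the default data directory
wget http://download.geofabrik.de/europe/andorra-latest.osm.pbf -O ./data/osm/andorra-latest.osm.pbf
# 2. pre-pull the routing engine image(s) matching ENABLED_ROUTERS
docker pull gisops/valhalla:latest
```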

Host installation

Requirements

  • Python >= 3.6
  • Docker >= 17.2
  • osmium-tool >= 1.6.0
  • osmctools
  • PostgreSQL (>= 9.6) database with PostGIS (>= 2.4) enabled
  • Redis database
  • SMTP server/details (for sending status emails)

Setup

We decided on poetry as a package manager. To get the latest version, simply do

curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python

Then create a virtual environment to hold your project and install the requirements:

python -m venv .venv
source .venv/bin/activate
poetry install [--no-dev]

Since the routing packages will be created by docker containers, we need to create a docker volume to share data between the host and the containers (for the docker setup this is auto-generated):

docker volume create routing-packager_packages \
    --driver local \
    --opt type=none \
    --opt device=$PWD \
    --opt o=bind

IMPORTANT: The volume needs to be named routing-packager_packages, as the Flask app references this volume by name.
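To verify the volume was created with the expected name and bind options, you can inspect it:

```shell
# Shows the volume's driver and mount options; errors out if the name doesn't exist
docker volume inspect routing-packager_packages
```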

To test if everything worked out, try to run the test server:

flask run

If you see an error along the lines of

FileNotFoundError: Provider directory doesn't exist, please create it and put some PBF files there: ./data/osm.

or a database connection error: it's a good thing. You'll have to configure the environment first.

Configuration

A few additional setup steps are necessary to run the project. First, review the most important configuration options below.

Most of the configuration takes place via environment variables. By far the easiest way is to declare all needed environment variables in the project's .env file.

There is an existing one in this repository with the needed minimum configuration, which you'll have to adapt before running the app.

ADMIN_EMAIL: The app's administrator's email address. Used as user ID. Some operations are only permitted for the admin account. Default [email protected].

ADMIN_PASS: The app's administrator's password. Default admin.

DATA_DIR: The directory where the routing packages will be generated. A final package will be called $DATA_DIR/<router>/<job.name>/<job.name>.<job.compression>. All input PBFs must be stored here as well (see Concepts). Default ./data.
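As an illustration of that naming scheme: with the default DATA_DIR and a hypothetical Valhalla job named andorra using zip compression, the final package path would be:

```shell
# Hypothetical job: router=valhalla, job.name=andorra, job.compression=zip
echo "./data/valhalla/andorra/andorra.zip"
```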

ENABLED_ROUTERS: A comma-separated list of routers you'd like to generate graphs for, e.g. valhalla,osrm,ors,graphhopper. Default valhalla.

<ROUTER>_IMAGE: For each ROUTER in ENABLED_ROUTERS there has to exist a docker image, i.e. VALHALLA_IMAGE and OSRM_IMAGE have to exist if ENABLED_ROUTERS is valhalla,osrm. The <ROUTER>_IMAGE value must be a valid image, e.g. gisops/valhalla:latest for VALHALLA_IMAGE.

POSTGRES_DB: The name of the database you want to use for the app. Default gis.

POSTGRES_USER: The user name for the Postgres database with CREATE TABLE privileges. Default admin.

POSTGRES_PASS: The password for the Postgres database. Default admin.

REDIS_URL: The URL to the Redis database. Default redis://localhost:6379/0.
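Putting the options above together, a minimal .env might look like this; all values are illustrative placeholders and should be adapted, especially the credentials:

```shell
# Minimal example .env -- values are placeholders, adapt before use
ADMIN_EMAIL=admin@example.org
ADMIN_PASS=change_me
DATA_DIR=./data
ENABLED_ROUTERS=valhalla
VALHALLA_IMAGE=gisops/valhalla:latest
POSTGRES_DB=gis
POSTGRES_USER=admin
POSTGRES_PASS=admin
REDIS_URL=redis://localhost:6379/0
```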

Check out all configuration options on the Configuration page.

Get OSM PBF file

It's important that the PBF is located in the DATA_DIR directory (see Concepts):

wget http://download.geofabrik.de/europe/andorra-latest.osm.pbf -O $DATA_DIR/osm/andorra-latest.osm.pbf
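Since osmium-tool is a requirement anyway, you can sanity-check the downloaded extract with it:

```shell
# Print header metadata of the PBF (bounding box, timestamp etc.)
osmium fileinfo $DATA_DIR/osm/andorra-latest.osm.pbf
```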

Pull docker images

Since the graph generation takes place in docker containers, you need to pull the images of the routing engines you set up in ENABLED_ROUTERS:

# Choose the images according to ENABLED_ROUTERS.
# These are the app's defaults; if you need other images,
# you can specify env vars a la <ROUTER>_IMAGE, see
# https://github.com/gis-ops/routing-graph-packager/wiki/Configuration#complete-list
docker pull gisops/valhalla:latest
docker pull osrm/osrm-backend:latest
docker pull graphhopper/graphhopper:latest
docker pull openrouteservice/openrouteservice:latest

Set up databases

This project needs access to a PostGIS-enabled Postgres database. We recommend Kartoza's fantastic docker image.
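A minimal way to start such a database could look like the sketch below; the environment variable names (POSTGRES_USER, POSTGRES_PASS, POSTGRES_DBNAME) follow the kartoza/postgis image's conventions and should be double-checked against its documentation:

```shell
# Sketch: PostGIS database matching the app defaults (POSTGRES_DB=gis, user/pass admin)
docker run --name postgis -p 5432:5432 \
    -e POSTGRES_USER=admin \
    -e POSTGRES_PASS=admin \
    -e POSTGRES_DBNAME=gis \
    -d kartoza/postgis
```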

For the job queue you'll need a Redis database. Also best done via docker:

docker run --name redis -p 6379:6379 -d redis:6.0
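To check that Redis accepts connections:

```shell
# A running Redis answers: PONG
docker exec redis redis-cli ping
```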

The details for the database connections can be set up in the app configuration stage.

Get SMTP details

The server app will also need to be configured to send status emails to the user who requested a job.

Either get your email provider's SMTP details or quickly set up a fake email server, e.g.

  • fake-smtp-server, or
  • even easier, in a separate terminal: sudo python -m smtpd -n -c DebuggingServer localhost:1025

See the full list of SMTP configuration variables in Configuration.

Set up worker

The app itself doesn't do the heavy lifting; that's outsourced to an RQ worker, which generates the graph packages asynchronously.

That worker needs to be started, e.g.

source .venv/bin/activate && rq worker packaging

starts a worker listening on the Redis queue called packaging.

If you need the worker to operate in the background, e.g. as a service, refer to RQ's documentation.
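For a systemd-based setup, a unit file along these lines could serve as a starting point; all paths and the user are placeholders you'd have to fill in:

```ini
# /etc/systemd/system/rq-packaging.service (sketch, paths are placeholders)
[Unit]
Description=RQ worker for the packaging queue
After=network.target redis.service

[Service]
User=<your-user>
WorkingDirectory=/path/to/routing-graph-packager
ExecStart=/path/to/routing-graph-packager/.venv/bin/rq worker packaging
Restart=on-failure

[Install]
WantedBy=multi-user.target
```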

Set up cron jobs

At this point you can generate graph packages, but they won't yet be updated regularly with fresh OSM data. To do that, you have to register cron jobs.

To set up the update procedure, refer to the Update wiki section.


Finally you can run the test server and expect the correct behavior:

flask run

Now curl localhost:5000/api/v1/jobs should return []. Head over to Usage if you're not sure what to do next.