Update README.md with docker using celery and redis #869

Merged 1 commit on May 16, 2024
116 changes: 116 additions & 0 deletions deployment_tools/docker/README.md
@@ -64,3 +64,119 @@ You'll still need to set the dashboard password:
- Using a modern IDE, you can also directly edit the files inside the containers - and even debug them!


## Combining py4web with Celery and Redis

Celery can be used to handle long-running background tasks.
Celery uses Redis as its message broker; Redis runs in a separate container.

At the end of the docker-compose file, add a Redis service:

redis:
restart: always
image: redis
ports:
- "6379:6379"

In the app's settings.py (the app is based on _scaffold):

# Celery settings
USE_CELERY = True
CELERY_BROKER = "redis://redis:6379/0"
CELERY_BACKEND = "redis://redis:6379/0"
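The broker and backend URLs follow the scheme `redis://host:port/db`: here `redis` is the Compose service name (resolvable on the Compose network) and `0` is the Redis database number. A quick standard-library sanity check of how such a URL decomposes:

```python
from urllib.parse import urlparse

# Same URL as CELERY_BROKER / CELERY_BACKEND above
url = urlparse("redis://redis:6379/0")

# hostname is the Compose service name, port the published Redis port,
# and the path (minus the leading slash) the Redis database index
print(url.hostname, url.port, url.path.lstrip("/"))  # → redis 6379 0
```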

In common.py:

# #######################################################
# Optionally configure celery
# #######################################################
if settings.USE_CELERY:
    from celery import Celery

    # to use "from .common import scheduler" and then use it according
    # to celery docs, examples in tasks.py
    scheduler = Celery(
        "apps.%s.tasks" % settings.APP_NAME,
        broker=settings.CELERY_BROKER,
        backend=settings.CELERY_BACKEND,
    )
    scheduler.conf.broker_connection_retry_on_startup = True

In the Dockerfile, use an entrypoint.sh script to start the Celery beat scheduler and worker, then launch py4web.

entrypoint.sh:

#!/bin/bash
. /home/py4web/.venv/bin/activate
# run beat and the worker in the background; exec only the final foreground process
celery -A apps.myapp.tasks beat &
celery -A apps.myapp.tasks worker --loglevel=info &
exec py4web run --password_file password.txt --host 0.0.0.0 --port 8000 apps

Complete Dockerfile:

FROM ubuntu:latest

ARG user=py4web
ENV PY4WEB_ROOT=/home/$user

RUN apt update && \
apt install -y git locales locales-all python3.12 python3-pip python3.12-venv memcached && \
service memcached restart && \
groupadd -r $user && \
useradd -m -r -g $user $user && \
python3 -m venv $PY4WEB_ROOT/.venv && \
. $PY4WEB_ROOT/.venv/bin/activate && \
python3 -m pip install -U py4web psycopg2-binary && \
python3 -m pip install -U "celery[redis]"

ENV LC_ALL=en_US.UTF-8
ENV LANG=en_US.UTF-8
ENV LANGUAGE=en_US.UTF-8

USER $user

RUN . $PY4WEB_ROOT/.venv/bin/activate && \
cd $PY4WEB_ROOT/ && py4web setup --yes apps
# use ./venv/bin/py4web set_password
COPY password.txt $PY4WEB_ROOT/.

EXPOSE 8000

WORKDIR $PY4WEB_ROOT/
# --chmod ensures the script is executable (requires BuildKit)
COPY --chmod=0755 entrypoint.sh /usr/local/bin/
ENTRYPOINT [ "/usr/local/bin/entrypoint.sh" ]

Complete docker-compose.yml:

services:

web:
build: .
ports:
- "8000:8000"
environment:
- PYDAL_URI=postgres://foo:bar@postgres:5432/baz
- PYDAL_URI2=mysql://root:secret@localhost/ursadina_gtd
volumes:
- ./apps:/home/py4web/apps
stdin_open: true
tty: true
depends_on:
- postgres
- redis

postgres:
restart: always
image: postgres
environment:
- POSTGRES_USER=foo
- POSTGRES_PASSWORD=bar
- POSTGRES_DB=baz
- POSTGRES_PORT=5432
ports:
- "5432:5432"
volumes:
- ./data/postgres:/var/lib/postgresql/data
redis:
restart: always
image: redis
ports:
- "6379:6379"
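With all three files in place, the stack can be brought up with Docker Compose (assuming the Compose v2 CLI):

```shell
# Build the image and start web, postgres and redis together
docker compose up --build -d

# Follow the py4web and celery output from the web container
docker compose logs -f web
```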

