Merge branch 'main' into 245-unify-terminology
La-Cezanne authored Aug 27, 2023
2 parents d8a563c + 3084368 commit e37670f
Showing 47 changed files with 3,574 additions and 641 deletions.
51 changes: 40 additions & 11 deletions README.md
@@ -74,7 +74,7 @@ python manage.py migrate --settings=config.settings.local

# (optional) install example data
python manage.py loaddata --settings=config.settings.local e2e_tests/database/test_database.json
cp -r e2e_tests/database/test_database_uploads cpmonitor/images/uploads
cp -r e2e_tests/database/test_database_uploads/. cpmonitor/images/uploads

# install css and javascript libraries
yarn install
@@ -119,7 +119,7 @@ pytest --ignore e2e_tests/test_deployed.py
rm db/db.sqlite3
poetry run python manage.py migrate --settings=config.settings.local
poetry run python manage.py loaddata --settings=config.settings.local e2e_tests/database/test_database.json
cp -r e2e_tests/database/test_database_uploads cpmonitor/images/uploads
cp -r e2e_tests/database/test_database_uploads/. cpmonitor/images/uploads
docker compose up -d --build
docker compose -f docker/reverseproxy/docker-compose.yml up -d --build
pytest e2e_tests/test_deployed.py
@@ -142,11 +142,17 @@ pytest --headed <path-to-e2e-test>
From a local database filled with suitable data, generate a fixture named `example_fixture` with

```shell
python -Xutf8 manage.py dumpdata cpmonitor -e contenttypes -e admin.logentry -e sessions --indent 2 --settings=config.settings.local > cpmonitor/fixtures/example_fixture.json
python -Xutf8 manage.py dumpdata -e contenttypes -e auth.Permission -e admin.LogEntry -e sessions --indent 2 --settings=config.settings.local > cpmonitor/fixtures/example_fixture.json
```

(The `-Xutf8` and `--indent 2` options ensure consistent and readable output on all platforms.)

The arguments `-e contenttypes -e auth.Permission -e admin.LogEntry -e sessions` exclude tables which are pre-filled
by Django, either at setup or during usage, and whose content may change depending on the models in the project. If they are
included, everything works fine at first, since `loaddata` silently accepts data that is already there. However, as soon as
the data to load clashes with existing content, it fails. `-e admin.LogEntry` excludes references to content types
which may otherwise be inconsistent. `-e sessions` excludes unneeded data which would otherwise clog the JSON file.

This fixture may be loaded in a test as follows (similarly in a pytest fixture); the snippet below is only an illustrative sketch, not code from the repository:

```python
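# Illustrative sketch only (not code from the repository): load the fixture in a
# Django TestCase. A pytest fixture could instead call
# django.core.management.call_command("loaddata", "example_fixture").
from django.test import TestCase


class ExampleFixtureTest(TestCase):
    # Fixture files are resolved against the app's `fixtures/` directory.
    fixtures = ["example_fixture"]

    def test_fixture_is_loaded(self):
        # Replace with assertions against the loaded data.
        ...
```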
@@ -241,7 +247,7 @@ Afterwards the test database has to be updated as well. Use the dumpdata command
currently running database:

```shell
python -Xutf8 manage.py dumpdata -e contenttypes -e admin.logentry -e sessions --indent 2 --settings=config.settings.local > e2e_tests/database/test_database.json
python -Xutf8 manage.py dumpdata -e contenttypes -e auth.Permission -e admin.LogEntry -e sessions --indent 2 --settings=config.settings.local > e2e_tests/database/test_database.json
```

Cheat-sheet to make sure the correct data is dumped:
@@ -251,10 +257,10 @@ git checkout right-before-model-change
rm db/db.sqlite3
python manage.py migrate --settings=config.settings.local
python manage.py loaddata --settings=config.settings.local e2e_tests/database/test_database.json
cp -r e2e_tests/database/test_database_uploads cpmonitor/images/uploads
cp -r e2e_tests/database/test_database_uploads/. cpmonitor/images/uploads
git checkout after-model-change-including-migration
python manage.py migrate --settings=config.settings.local
python -Xutf8 manage.py dumpdata -e contenttypes -e admin.logentry -e sessions --indent 2 --settings=config.settings.local > e2e_tests/database/test_database.json
python -Xutf8 manage.py dumpdata -e contenttypes -e auth.Permission -e admin.LogEntry -e sessions --indent 2 --settings=config.settings.local > e2e_tests/database/test_database.json
# Only if additional images were uploaded:
cp -r cpmonitor/images/uploads e2e_tests/database/test_database_uploads
```
@@ -371,7 +377,7 @@ Possibly migrate, test the data, and check that the size is reasonable. Then mak

```sh
SNAPSHOT_NAME=prod_database_$(date -u +"%FT%H%M%SZ")
python -Xutf8 manage.py dumpdata -e contenttypes -e admin.logentry -e sessions --indent 2 --settings=config.settings.local > e2e_tests/database/${SNAPSHOT_NAME}.json
python -Xutf8 manage.py dumpdata -e contenttypes -e auth.Permission -e admin.LogEntry -e sessions --indent 2 --settings=config.settings.local > e2e_tests/database/${SNAPSHOT_NAME}.json
cp -r cpmonitor/images/uploads e2e_tests/database/${SNAPSHOT_NAME}_uploads
echo "Some useful information, e.g. the migration state of the snapshot" > e2e_tests/database/${SNAPSHOT_NAME}.README
du -hs e2e_tests/database/${SNAPSHOT_NAME}*
@@ -415,7 +421,7 @@ docker compose up -d
```
5. Copy the images, the compose files, the certificate renewal cron job and the reverse proxy settings to the server:
```sh
scp -C cpmonitor.tar klimaschutzmonitor-dbeaver.tar docker-compose.yml crontab renew-cert.sh docker/reverseproxy/ [email protected]:/tmp/
scp -C cpmonitor.tar klimaschutzmonitor-dbeaver.tar docker-compose.yml crontab reload-cert.sh docker/reverseproxy/ [email protected]:/tmp/
```
6. Login to the server:
```sh
@@ -465,8 +471,8 @@ docker compose up -d
11. Install certificate renewal cron job:
```sh
crontab /tmp/crontab
cp /tmp/renew-cert.sh /home/monitoring/
chmod +x /home/monitoring/renew-cert.sh
cp /tmp/reload-cert.sh /home/monitoring/
chmod +x /home/monitoring/reload-cert.sh
```
### Database Client
@@ -478,13 +484,16 @@ the environment) and the credentials can be found in the .env.local file. For te
configured in the respective .env files on the server.
### TLS Certificate and Renewal
#### Overview
We currently use a single TLS certificate for both monitoring.localzero.net and monitoring-test.localzero.net. The certificate is issued by letsencrypt.org; requesting and renewing it is handled by [acme.sh](https://github.com/acmesh-official/acme.sh), which runs in a container. This solution allows us to keep almost all necessary code and config in the repo instead of only on the server.
#### Initial Issuance
The initial certificate was issued using the following command:
```sh
docker exec acme-sh --issue -d monitoring-test.localzero.net -d monitoring.localzero.net --standalone --server https://acme-v02.api.letsencrypt.org/directory --fullchain-file /acme.sh/fullchain.cer --key-file /acme.sh/ssl-cert.key
```
#### Renewal
Renewal is performed automatically by acme.sh's internal cron job, which...
- checks if a renewal is necessary, and if so:
- requests a new certificate from letsencrypt,
@@ -495,6 +504,26 @@ A reload of the nginx config is independently triggered every four hours by our
```sh
crontab -l
```
This job runs [a script](renew-cert.sh) which applies the latest certificate that acme.sh has produced. This means there can be some delay between renewal and application of the certificate, but since acme.sh performs renewal a few days before expiry, there should be enough time for nginx to reload the certificate.
This job runs [a script](reload-cert.sh) which applies the latest certificate that acme.sh has produced. This means there can be some delay between renewal and application of the certificate, but since acme.sh performs renewal a few days before expiry, there should be enough time for nginx to reload the certificate.
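
For orientation, here is a minimal sketch of what the crontab entry and the reload script could look like; the schedule, paths, and container name are assumptions for illustration, not taken from the repository:

```sh
# Hypothetical crontab entry: run the reload script every four hours.
0 */4 * * * /home/monitoring/reload-cert.sh

# Hypothetical reload-cert.sh: copy the certificate files produced by acme.sh to
# the location nginx reads them from, then reload nginx in the reverse proxy container.
cp /path/to/acme-sh-output/fullchain.cer /path/to/nginx/ssl/fullchain.cer
cp /path/to/acme-sh-output/ssl-cert.key /path/to/nginx/ssl/ssl-cert.key
docker exec reverseproxy nginx -s reload  # container name is an assumption
```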

#### acme-sh Configuration and Debugging

The configuration used by acme.sh's cron job (not our nginx reload cron job!), e.g. the renewal interval, can be changed in `reverseproxy/ssl_certificates/monitoring-test.localzero.net_ecc/` on the server.
The following commands can be executed on the server to debug and test the acme.sh configuration:
```shell
# view certificate creation date and next renew date
docker exec acme-sh --list
# tell acme-sh to run its cronjob now, using letsencrypt's test environment (to bypass rate limiting)
docker exec acme-sh --cron --staging

# tell acme-sh to run its cronjob now, using letsencrypt's PROD environment (affected by rate limiting: 5 certs every couple of weeks)
docker exec acme-sh --cron

# force a renewal via letsencrypt's PROD environment, even if renewal time hasn't been reached yet
docker exec acme-sh --cron --force
```
#### TLS Certificates and Running Locally
When running locally, we instead use a [certificate created for localhost](ssl_certificates_localhost). Since ownership of localhost cannot be certified, this is a single self-signed certificate instead of a full chain signed by a CA like on the server, and an exception must be added to your browser to trust it.
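
For reference, a self-signed certificate of this kind can be generated with openssl roughly as follows; the file names and validity period are illustrative assumptions, not necessarily how the files in `ssl_certificates_localhost` were created:

```sh
# Create a self-signed key and certificate for localhost (names and validity assumed).
openssl req -x509 -newkey rsa:4096 -nodes \
  -keyout localhost.key -out localhost.crt \
  -days 365 -subj "/CN=localhost"
```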
5 changes: 5 additions & 0 deletions config/nginx/conf.d/nginx.conf
@@ -1,11 +1,16 @@
# nginx config for deployment on server, including SSL/TLS setup

server_tokens off;

server {
listen 8080;

server_name monitoring.localzero.net monitoring-test.localzero.net;

client_max_body_size 100m;
add_header Content-Security-Policy "default-src 'self'";
add_header Strict-Transport-Security "max-age=63072000; includeSubDomains; preload";
add_header Cache-Control "no-store";

location / {
# pass requests for dynamic content to gunicorn
43 changes: 42 additions & 1 deletion config/settings/base.py
@@ -47,6 +47,11 @@ def get_env(var: str) -> str:
"django.contrib.staticfiles",
"treebeard",
"martor",
"rules.apps.AutodiscoverRulesConfig",
"allauth",
"allauth.account",
"allauth.socialaccount",
# "invitations", We do not use invitations.Invitation and therefore do not want its migrations.
"cpmonitor.apps.CpmonitorConfig",
]

@@ -60,12 +65,17 @@ def get_env(var: str) -> str:
"django.middleware.clickjacking.XFrameOptionsMiddleware",
]

AUTHENTICATION_BACKENDS = (
"rules.permissions.ObjectPermissionBackend",
"django.contrib.auth.backends.ModelBackend",
)

ROOT_URLCONF = "cpmonitor.urls"

TEMPLATES = [
{
"BACKEND": "django.template.backends.django.DjangoTemplates",
"DIRS": [],
"DIRS": [BASE_DIR / "cpmonitor" / "templates" / "overrides"],
"APP_DIRS": True,
"OPTIONS": {
"context_processors": [
@@ -157,3 +167,34 @@ def get_env(var: str) -> str:
MARTOR_UPLOAD_PATH = "uploads/"
MARTOR_UPLOAD_URL = "/api/uploader/"
MAX_IMAGE_UPLOAD_SIZE = 104857600 # 100 MB

# django-allauth configuration:
# https://django-allauth.readthedocs.io/en/latest/configuration.html
# Most customization is done in the adapter:
ACCOUNT_ADAPTER = "cpmonitor.adapters.AllauthInvitationsAdapter"
# django-allauth needs allauth.socialaccount to really work, but we don't use its OAuth parts
SOCIALACCOUNT_PROVIDERS = {}
ACCOUNT_EMAIL_VERIFICATION = "none" # Would need a working email config.

# django-invitations configuration:
# https://django-invitations.readthedocs.io/en/latest/configuration.html
# django-invitations is closely coupled to django-allauth and uses
# the same adapter.
INVITATIONS_ADAPTER = "cpmonitor.adapters.AllauthInvitationsAdapter"
# To couple an invitation to a city and access right (either admin or editor)
# a custom model is needed. To prevent `invitations.Invitation` from being
# used, django-invitations does not appear in `INSTALLED_APPS` above.
INVITATIONS_INVITATION_MODEL = "cpmonitor.Invitation"
INVITATIONS_GONE_ON_ACCEPT_ERROR = False
INVITATIONS_INVITATION_ONLY = True
# In order to use our custom view instead of the one from django-invitations,
# `invitations.urls` could not be used. This parameter's default value points
# to that and had to be replaced by our own:
INVITATIONS_CONFIRMATION_URL_NAME = "accept-invite"
# Setting this to true would cause a signal handler to be installed, which
# would try to access the email field of an invitation and fail on the custom model.
INVITATIONS_ACCEPT_INVITE_AFTER_SIGNUP = False

# django core configuration used by django-invitations, django-allauth
LOGIN_URL = "/admin/login/"
LOGIN_REDIRECT_URL = "/admin/"
3 changes: 3 additions & 0 deletions config/settings/container.py
@@ -11,3 +11,6 @@
CSRF_TRUSTED_ORIGINS = get_env("DJANGO_CSRF_TRUSTED_ORIGINS").split(",")
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = get_env("DJANGO_DEBUG") == "True"
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
46 changes: 46 additions & 0 deletions cpmonitor/adapters.py
@@ -0,0 +1,46 @@
from allauth.account.adapter import DefaultAccountAdapter
from invitations.app_settings import app_settings

from .models import AccessRight, City, get_invitation


class AllauthInvitationsAdapter(DefaultAccountAdapter):
def is_open_for_signup(self, request):
"""
Overwrites django-invitations.
Checks that there exists an invitation instead of email.
"""
if get_invitation(request):
return True
elif app_settings.INVITATION_ONLY:
return False
else:
return True

def save_user(self, request, user, form, commit=True):
"""
Overwrites django-allauth.
Checks that there is an invitation. If there is none, the registration is rejected
and the already created user object is discarded (not saved).
Otherwise, the user is saved and access rights are set according to the invitation.
"""
invitation = get_invitation(request)
if not invitation:
self.add_error(
None,
"Die Registrierung ist nur möglich über einen gültigen Einladungslink.",
)
return

user.is_staff = True
user = super().save_user(request, user, form, commit)

city: City = invitation.city
if invitation.access_right == AccessRight.CITY_EDITOR:
city.city_editors.add(user)
city.save()
elif invitation.access_right == AccessRight.CITY_ADMIN:
city.city_admins.add(user)
city.save()

return user