Ahoy containers #15

Merged
merged 15 commits into from
Mar 24, 2021
1 change: 1 addition & 0 deletions .gitignore
@@ -9,6 +9,7 @@ __pycache__/
management/
management.py
tests/resources/
db-pgdata/

# Elastic Beanstalk Files
.elasticbeanstalk/
35 changes: 35 additions & 0 deletions Dockerfile
@@ -0,0 +1,35 @@
FROM python:3.8.8-slim-buster


ARG PROJECT_NAME=${PROJECT_NAME:-kps}
ARG HOST_UID=${HOST_UID:-9000}
ARG HOST_USER=${HOST_USER:-app}

ENV HOST_HOME=/home/$HOST_USER
ENV APP_DIR=$HOST_HOME/$PROJECT_NAME
ENV PATH $HOST_HOME/.local/bin:$PATH

# Create a user specifically for app running
# Sets them with enough permissions in its home dir
RUN adduser --home $HOST_HOME --uid $HOST_UID $HOST_USER --quiet --system --group \
&& chown -R $HOST_UID:$HOST_UID $HOST_HOME/ \
&& chmod -R 770 $HOST_HOME \
&& chmod g+s $HOST_HOME

# Switches to created user
USER $HOST_UID

# Creates an app dir
RUN mkdir $APP_DIR
WORKDIR $APP_DIR

# Copies and installs requirements
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Finishes copying code
COPY . .

EXPOSE 5000

CMD ["sh", "docker/entrypoint.sh"]
32 changes: 30 additions & 2 deletions Makefile
@@ -1,8 +1,16 @@
.PHONY: test

PROJECT_NAME=$(notdir $(PWD))
HOST_NAME=${USER}
CONTAINER_UID=$(HOST_NAME)_${PROJECT_NAME}
export PROJECT_NAME

tests:
# Sanity check & removal of idle postgres images
IDLE_CONTAINERS = $(shell docker ps -aq -f name=postgres -f name=web)
UP_CONTAINERS = $(shell docker ps -q -f name=postgres -f name=web)

test-local:
@echo "*** 'tests' directory should exist at project root. Stop."

db-migration:
@@ -18,5 +26,25 @@ test-integration:
test-e2e:
pytest --color=yes --showlocals --tb=short -v tests/auth/e2e

test: tests db-migration test-unit test-integration test-e2e
test-local: tests db-migration test-unit test-integration test-e2e

build:
@docker-compose build

test:
@docker-compose -p $(CONTAINER_UID) run --rm --use-aliases --service-ports web sh docker/test.sh
@docker kill $(PROJECT_NAME)_postgres
@docker rm $(PROJECT_NAME)_postgres

clean:
@docker-compose -p $(CONTAINER_UID) down --remove-orphans 2>/dev/null
@[ ! -z "$(UP_CONTAINERS)" ] && docker kill $(UP_CONTAINERS) || echo "Neat."
@[ ! -z "$(IDLE_CONTAINERS)" ] && docker rm $(IDLE_CONTAINERS) || echo "Clean."

service:
@docker-compose -p $(CONTAINER_UID) up

prune:
docker system prune -af


47 changes: 39 additions & 8 deletions README.md
@@ -33,34 +33,65 @@ This is project's in its early stages, and should receive a big WIP tag. We shou

## Instructions

Given the project's current status, running it *for now* means making sure the tests pass.
We will be improving the installation and usage experience shortly. Hold tight.
There are two ways to run this: (i) locally, through a Python virtual environment; (ii) using `docker-compose`.

### Step 1: Dependencies & environment
We like the `Makefile` interface, and while we don't yet deliver a proper CLI, commands are handled there.
To find out which commands are available, `cat Makefile`.

This project uses `poetry` to manage dependencies. Even though `poetry install` is slow, we find
`poetry` a valuable tool for managing dependencies' version compatibility.

While we don't have a fully automated build pipeline, we agree to run `poetry export -f requirements.txt > requirements.txt`.

### Local docker

Make sure the Docker daemon is running.

### Step 1: Build image

```bash
make build
```

### Step 2: Serve it or test it

```bash
make service
make test
```

### Step 3 (Optional): Clean containers

```bash
make clean
```

### Local python

#### Step 1: Dependencies & environment

This project uses `poetry` to manage dependencies.
That said, how you instantiate your virtual environment is up to you. You can do that now.

Inside your blank python virtual environment:

```shell
pip install poetry && poetry install
pip install -r requirements.txt
```

### Step 2: Prepare your database
#### Step 2: Prepare your database

As no containerization is involved in this path, you need `postgres` up and running on your local machine.

```shell
psql -c "create database template"
```

### Step 3: Test it
#### Step 3: Test it

Right now you should be able to run the entire test-suite properly.

```shell
make test
make test-local
```


53 changes: 53 additions & 0 deletions docker-compose.yml
@@ -0,0 +1,53 @@
version: "3.9"

x-common-variables: &common-variables
POSTGRES_USER: ${POSTGRES_USER:-postgres}
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-postgres}
POSTGRES_DB: ${POSTGRES_DB:-template}

services:
dbpg:
container_name: ${PROJECT_NAME}_postgres
image: postgres
environment:
<<: *common-variables
PGDATA: /data/postgres
ports:
- "5432:5432"
volumes:
- ./db-pgdata:/var/lib/postgresql/data/pgdata
networks:
- backend
restart: unless-stopped

web:
environment:
<<: *common-variables
POSTGRES_HOST: dbpg
MAX_CONCURRENCY: 1
HOST: "0.0.0.0"
PORT: "5000"
container_name: ${PROJECT_NAME}_web
image: kms:test
build:
context: .
args:
- HOST_UID=${HOST_UID:-9000}
- HOST_USER=${HOST_USER:-app}
ports:
- "5000:5000"
depends_on:
- dbpg
volumes:
- ./:/app
networks:
- backend
restart: unless-stopped


networks:
backend:
driver: bridge

volumes:
db-pgdata:
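`docker-compose` interpolates the `${VAR:-default}` references above from the shell environment or from a `.env` file next to the compose file. A hypothetical `.env` overriding the defaults might look like (all values illustrative):

```
PROJECT_NAME=kps
POSTGRES_USER=postgres
POSTGRES_PASSWORD=change-me
POSTGRES_DB=template
HOST_UID=1000
HOST_USER=app
```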
2 changes: 2 additions & 0 deletions docker/entrypoint.sh
@@ -0,0 +1,2 @@
#!/bin/bash
sh docker/uvicorn.sh
9 changes: 9 additions & 0 deletions docker/init-user-db.sh
@@ -0,0 +1,9 @@
#!/bin/bash

set -e

psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" --dbname "$POSTGRES_DB" <<-EOSQL
CREATE USER tester;
CREATE DATABASE court;
GRANT ALL PRIVILEGES ON DATABASE court TO tester;
EOSQL
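The official `postgres` image only runs scripts like this one when they are mounted into `/docker-entrypoint-initdb.d`, and only on the first initialization of an empty data directory. The compose file in this PR does not mount it; a sketch of the volume entry that would wire it up:

```yaml
services:
  dbpg:
    volumes:
      - ./docker/init-user-db.sh:/docker-entrypoint-initdb.d/init-user-db.sh:ro
```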
2 changes: 2 additions & 0 deletions docker/migration.sh
@@ -0,0 +1,2 @@
alembic -x data=true downgrade base
alembic -x data=true upgrade head
5 changes: 5 additions & 0 deletions docker/test.sh
@@ -0,0 +1,5 @@
#!/bin/sh
sh docker/migration.sh
pytest --color=yes --showlocals --tb=short -v tests/auth/unit
pytest --color=yes --showlocals --tb=short -v tests/auth/integration
pytest --color=yes --showlocals --tb=short -v tests/auth/e2e
1 change: 1 addition & 0 deletions docker/uvicorn.sh
@@ -0,0 +1 @@
uvicorn src.server.app:app --port $PORT --host $HOST --loop uvloop --log-level info --workers $MAX_CONCURRENCY
2 changes: 1 addition & 1 deletion migrations/env.py
@@ -12,7 +12,7 @@
from alembic import context

from src import config as config_app
from src.federation import init
from src.federation import init

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
15 changes: 14 additions & 1 deletion src/config.py
@@ -2,7 +2,20 @@
from typing import Any

POSTGRES_URI_TEMPLATE = "postgresql://{}:{}@{}:{}/{}"
POSTGRES_DEFAULT = ("postgres", "", "localhost", 5432, "template")

PG_USER = os.environ.get("POSTGRES_USER", "postgres")
PG_PASSWORD = os.environ.get("POSTGRES_PASSWORD", "postgres")
PG_HOST = os.environ.get("POSTGRES_HOST", "localhost")
PG_PORT = os.environ.get("POSTGRES_PORT", 5432)
PG_DB = os.environ.get("POSTGRES_DB", "template")

POSTGRES_DEFAULT = (
PG_USER,
PG_PASSWORD,
PG_HOST,
PG_PORT,
PG_DB
)
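To see how the template and these defaults combine, here is a small self-contained sketch (`build_uri` is a hypothetical helper; the real module exposes `get_postgres_uri`, whose body is not shown in this diff):

```python
POSTGRES_URI_TEMPLATE = "postgresql://{}:{}@{}:{}/{}"

def build_uri(user="postgres", password="postgres",
              host="localhost", port=5432, db="template"):
    # Positional order matches POSTGRES_DEFAULT: user, password, host, port, db
    return POSTGRES_URI_TEMPLATE.format(user, password, host, port, db)

print(build_uri())
# postgresql://postgres:postgres@localhost:5432/template
```

One subtlety in the module above: `os.environ.get` always returns a string, so `PG_PORT` is an `int` only when `POSTGRES_PORT` is unset; the template formats either type fine.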


def default_user() -> tuple:
15 changes: 2 additions & 13 deletions src/core/ports/unit_of_work.py
@@ -1,20 +1,9 @@
import abc
from typing import Callable, Generator

from sqlalchemy import create_engine, orm

from src import config
from src import orm
from src.core.ports import repository

DEFAULT_SESSION_FACTORY = orm.sessionmaker(
bind=create_engine(
# ISOLATION LEVEL ENSURES aggregate's version IS RESPECTED
# That is, if version differs it will raise an exception
config.get_postgres_uri(),
isolation_level="REPEATABLE_READ",
),
autoflush=False,
)


class AbstractUnitOfWork(abc.ABC):
@@ -50,7 +39,7 @@ def rollback(self) -> None:
class SqlAlchemyUnitOfWork(AbstractUnitOfWork):
session: orm.Session

def __init__(self, session_factory: Callable = DEFAULT_SESSION_FACTORY):
def __init__(self, session_factory: Callable = orm.DEFAULT_SESSION_FACTORY):
self.session_factory: Callable = session_factory

def __exit__(self, *args): # type: ignore
14 changes: 13 additions & 1 deletion src/orm.py
@@ -1,6 +1,18 @@
from sqlalchemy import MetaData
from sqlalchemy.orm import Session
from sqlalchemy import create_engine, orm, MetaData

from src import config
import src.auth.adapters.orm

DEFAULT_SESSION_FACTORY: orm.sessionmaker = orm.sessionmaker(
bind=create_engine(
# ISOLATION LEVEL ENSURES aggregate's version IS RESPECTED
# That is, if version differs it will raise an exception
config.get_postgres_uri(),
isolation_level="REPEATABLE_READ",
),
autoflush=False,
)

def start_mappers() -> MetaData:
"""
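For readers unfamiliar with `sessionmaker`, a minimal self-contained sketch of the same construction against an in-memory SQLite database (SQLite is purely illustrative here; the module binds to Postgres with `REPEATABLE_READ` isolation, which SQLite does not support):

```python
from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker

# Same shape as DEFAULT_SESSION_FACTORY, minus the Postgres-specific options
engine = create_engine("sqlite:///:memory:")
SessionFactory = sessionmaker(bind=engine, autoflush=False)

# The factory is called to produce independent Session objects
session = SessionFactory()
try:
    assert session.execute(text("select 1")).scalar() == 1
finally:
    session.close()
```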