Migrate to PostgreSQL #54

Merged
16 commits merged on Aug 8, 2024
18 changes: 13 additions & 5 deletions .github/workflows/build.yaml
@@ -4,7 +4,11 @@ on:
image:
description: "Resulting docker image (tagged with commit SHA)"
value: ${{ jobs.set_image_name.outputs.image }}

workflow_dispatch:
inputs:
tag:
type: string
description: Image tag
env:
# Use docker.io for Docker Hub if empty
REGISTRY: ghcr.io
@@ -20,13 +24,16 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v3
with:
ref: ${{ github.event.pull_request.head.sha }}

# Workaround: https://github.com/docker/build-push-action/issues/461
- name: Setup Docker buildx
uses: docker/setup-buildx-action@79abd3f86f79a9d68a23c75a09a9a85889262adf

- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3

# Login against a Docker registry except on PR
# https://github.com/docker/login-action
- name: Log into registry ${{ env.REGISTRY }}
@@ -56,8 +63,9 @@
uses: docker/build-push-action@ac9327eae2b366085ac7f6a2d02df8aa8ead720a
with:
context: .
push: true
tags: ${{ steps.meta.outputs.tags }}
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ github.event.inputs.tag || steps.meta.outputs.tags }}
platforms: linux/amd64,linux/arm64
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha
cache-to: type=gha,mode=max
4 changes: 0 additions & 4 deletions .github/workflows/checks.yaml
@@ -2,8 +2,6 @@ name: Checks

on:
pull_request_target:
branches:
- "main"

defaults:
run:
@@ -29,8 +27,6 @@ jobs:
steps:
- name: "Checkout"
uses: actions/checkout@v3
with:
ref: ${{ github.event.pull_request.head.sha }}
- run: "make lint/${{ matrix.linter }}"

test:
6 changes: 3 additions & 3 deletions Dockerfile
@@ -1,15 +1,15 @@
FROM python:3.9.14-alpine3.16
FROM python:3.12.1-alpine3.19

ENV POETRY_INSTALLER_MAX_WORKERS=1
ENV POETRY_VIRTUALENVS_IN_PROJECT=false
ENV POETRY_VIRTUALENVS_PATH="/root/.venvs"
ENV VENV_PATH="${POETRY_VIRTUALENVS_PATH}/decky-plugin-store-9TtSrW0h-py3.9"
ENV VENV_PATH="${POETRY_VIRTUALENVS_PATH}/decky-plugin-store-9TtSrW0h-py3.12"

RUN apk add build-base
RUN apk add openssl-dev
RUN apk add python3-dev
RUN apk add curl libffi-dev \
&& curl -sSL https://install.python-poetry.org | python - --version 1.3.1 \
&& curl -sSL https://install.python-poetry.org | python - --version 1.7.1 \
&& apk del curl libffi-dev

ENV PATH="$POETRY_HOME/bin:$VENV_PATH/bin:/root/.local/bin:$PATH"
6 changes: 3 additions & 3 deletions Makefile
@@ -34,10 +34,10 @@ migrations/create:
alembic revision

dc/build:
docker-compose -f docker-compose.local.yml build
docker compose -f docker-compose.local.yml build

dc/%:
docker-compose -f docker-compose.local.yml run -w /app plugin_store make $*
docker compose -f docker-compose.local.yml run -w /app plugin_store make $*

deps/lock:
poetry lock --no-update
@@ -46,4 +46,4 @@ deps/upgrade:
poetry lock

test:
pytest ./tests
SQLALCHEMY_WARN_20=1 pytest ./tests
14 changes: 11 additions & 3 deletions docker-compose.local.yml
@@ -4,7 +4,7 @@ services:
build: .
container_name: plugin_store
environment:
- DB_PATH=/app/.database/plugin_store_new.db
- DB_URL
Review comment (Contributor), addressed with a sketch after this file's diff:
We probably want this DB_URL specified for local dev, as the postgres instance starts in the same dockerfile anyway.

- ANNOUNCEMENT_WEBHOOK
- SUBMIT_AUTH_KEY=deadbeef
- B2_APP_KEY_ID
@@ -20,7 +20,15 @@
redis_db:
image: redis:latest
restart: unless-stopped
ports:
- "6379:6379"
environment:
- REDIS_PORT=6379

postgres_db:
image: postgres:16 # Postgres databases are only compatible with their same major version
restart: unless-stopped
environment:
- POSTGRES_DB=decky
- POSTGRES_USER=decky
- POSTGRES_PASSWORD=decky
volumes:
- ../store-postgres:/var/lib/postgresql/data
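Following up on the review comment above: the local compose file starts the postgres_db service itself, so a sensible local default could be derived instead of leaving DB_URL empty. A minimal sketch, assuming the service name and the decky/decky credentials defined in this file; none of this is part of the PR:

```python
from os import getenv

# Hypothetical fallback for local development only; assumes the postgres_db
# service from docker-compose.local.yml is reachable under its service name
# and uses the POSTGRES_USER/POSTGRES_PASSWORD/POSTGRES_DB values above.
DEFAULT_LOCAL_DB_URL = "postgresql+asyncpg://decky:decky@postgres_db/decky"
DB_URL = getenv("DB_URL") or DEFAULT_LOCAL_DB_URL
```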
18 changes: 12 additions & 6 deletions docker-compose.yml
@@ -4,26 +4,32 @@ services:
build: .
container_name: "${DEPLOYMENT_NAME}"
environment:
- DB_PATH
- DB_URL
Review comment (Contributor), addressed with a sketch after this file's diff:
Actually, same here. We can build it from POSTGRES_PASSWORD to not have it hardcoded here.

- ANNOUNCEMENT_WEBHOOK
- SUBMIT_AUTH_KEY
- B2_APP_KEY_ID
- B2_APP_KEY
- B2_BUCKET_ID
volumes:
- ~/database:/db
networks:
- plugins-network
- default
restart: unless-stopped

redis_db:
image: redis:latest
restart: unless-stopped
networks:
- plugins-network
environment:
- REDIS_PORT=6379


postgres_db:
image: postgres:16 # Postgres databases are only compatible with their same major version
restart: unless-stopped
environment:
- POSTGRES_DB=decky
- POSTGRES_USER=decky
- POSTGRES_PASSWORD
volumes:
- ${DB_PATH}:/var/lib/postgresql/data

networks:
plugins-network:
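Regarding the second review comment above, one way to avoid hardcoding credentials would be to assemble the URL from POSTGRES_PASSWORD at startup. A rough sketch, assuming the postgres_db service name and the decky database/user configured above; this helper is not part of the PR:

```python
from os import getenv
from urllib.parse import quote_plus


def build_db_url() -> str:
    # Hypothetical helper: only POSTGRES_PASSWORD has to be supplied to the
    # deployment; user, database and host mirror the compose file above.
    password = quote_plus(getenv("POSTGRES_PASSWORD", ""))
    return getenv("DB_URL") or f"postgresql+asyncpg://decky:{password}@postgres_db/decky"
```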
2 changes: 1 addition & 1 deletion plugin_store/api/__init__.py
@@ -72,7 +72,7 @@ async def plugins_list(
tags: list[str] = fastapi.Query(default=[]),
hidden: bool = False,
sort_by: Optional[SortType] = None,
sort_direction: SortDirection = SortDirection.desc,
sort_direction: SortDirection = SortDirection.ASC,
db: "Database" = Depends(database),
):
tags = list(filter(None, reduce(add, (el.split(",") for el in tags), [])))
2 changes: 1 addition & 1 deletion plugin_store/cdn.py
@@ -38,7 +38,7 @@ async def _b2_upload(filename: str, binary: "bytes", mime_type: str = "b2/x-auto"
headers={"Authorization": f"Basic: {b64encode(auth_str).decode('utf-8')}"},
) as res:
if not res.status == 200:
getLogger().error("B2 LOGIN ERROR " + await res.read())
getLogger().error(f"B2 LOGIN ERROR {await res.read()!r}")
return
res_data = await res.json()

10 changes: 5 additions & 5 deletions plugin_store/constants.py
@@ -9,11 +9,11 @@


class SortDirection(Enum):
desc = "desc"
asc = "asc"
DESC = "desc"
ASC = "asc"


class SortType(Enum):
name = "name"
date = "date"
downloads = "downloads"
NAME = "name"
DATE = "date"
DOWNLOADS = "downloads"
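The sort enums keep their lowercase values, so renaming the members to upper case does not change the query strings the API accepts: FastAPI and pydantic resolve enum query parameters by value, not by member name. A small self-contained check illustrating this, with the class copied to mirror constants.py:

```python
from enum import Enum


class SortDirection(Enum):
    DESC = "desc"
    ASC = "asc"


# Lookup is by value, so "?sort_direction=asc" still resolves to ASC even
# though the member name changed from "asc" to "ASC".
assert SortDirection("asc") is SortDirection.ASC
assert SortDirection.ASC.value == "asc"
```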
18 changes: 10 additions & 8 deletions plugin_store/database/database.py
@@ -13,7 +13,7 @@
from sqlalchemy.exc import NoResultFound, SQLAlchemyError
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.sql import collate, delete, select, update
from sqlalchemy.sql import delete, select, update

from constants import SortDirection, SortType

@@ -29,7 +29,7 @@


async_engine = create_async_engine(
f"sqlite+aiosqlite:///{getenv('DB_PATH')}",
getenv("DB_URL"),
pool_pre_ping=True,
# echo=settings.ECHO_SQL,
)
@@ -155,7 +155,7 @@ async def search(
tags: "Iterable[str] | None" = None,
include_hidden: "bool" = False,
sort_by: Optional[SortType] = None,
sort_direction: SortDirection = SortDirection.desc,
sort_direction: SortDirection = SortDirection.DESC,
limit: int = 50,
page: int = 0,
) -> list["Artifact"]:
@@ -168,17 +168,19 @@ async def search(
if not include_hidden:
statement = statement.where(Artifact.visible.is_(True))

if sort_direction == SortDirection.asc:
if sort_direction == SortDirection.ASC:
direction = asc
else:
direction = desc

if sort_by == SortType.name:
statement = statement.order_by(direction(collate(Artifact.name, "NOCASE")))
elif sort_by == SortType.date:
if sort_by == SortType.NAME:
statement = statement.order_by(direction(Artifact.name))
elif sort_by == SortType.DATE:
statement = statement.order_by(direction(Artifact.created))
elif sort_by == SortType.downloads:
elif sort_by == SortType.DOWNLOADS:
statement = statement.order_by(direction(Artifact.downloads))
else:
statement = statement.order_by(direction(Artifact.id))

result = (await session.execute(statement)).scalars().all()
return result or []
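The SQLite-specific collate(Artifact.name, "NOCASE") ordering is dropped here, which makes name sorting case-sensitive on PostgreSQL. If case-insensitive ordering is still wanted, one portable option (not part of this PR) is to sort on the lowercased column; a minimal sketch:

```python
from sqlalchemy import asc, func
from sqlalchemy.orm import InstrumentedAttribute
from sqlalchemy.sql import Select


def order_case_insensitive(statement: Select, column: InstrumentedAttribute, direction=asc) -> Select:
    # func.lower() is supported by both SQLite and PostgreSQL, unlike the
    # SQLite-only "NOCASE" collation removed in this change.
    return statement.order_by(direction(func.lower(column)))
```

In search() this would be called as order_case_insensitive(statement, Artifact.name, direction).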
4 changes: 2 additions & 2 deletions plugin_store/database/migrations/env.py
@@ -43,7 +43,7 @@ def run_migrations_offline() -> None:

"""
context.configure(
url=f"sqlite+aiosqlite:///{getenv('DB_PATH')}",
url=getenv("DB_URL"),
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
@@ -68,7 +68,7 @@ async def run_migrations_online() -> None:

"""
connectable = create_async_engine(
f"sqlite+aiosqlite:///{getenv('DB_PATH')}",
getenv("DB_URL"),
poolclass=pool.NullPool,
future=True,
)
@@ -21,7 +21,7 @@
def upgrade() -> None:

conn = op.get_bind()
statement = sa.select(Tag.tag, sa.func.group_concat(Tag.id, ",").label("ids")).group_by(Tag.tag)
statement = sa.select(Tag.tag, sa.func.string_agg(str(Tag.id), ",").label("ids")).group_by(Tag.tag)
tags = conn.execute(statement)
replacements = {(ids[0], tag.tag): ids[1:] for tag in tags if len(ids := sorted(map(int, tag.ids.split(",")))) > 1}
for (dest_id, tag), src_ids in replacements.items():
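group_concat() is SQLite-only; PostgreSQL's equivalent is string_agg(), which expects a text argument. A sketch of the same aggregation written portably with an explicit cast, using a hypothetical minimal Tag table standing in for the model used by this migration:

```python
import sqlalchemy as sa

# Hypothetical, minimal mapping standing in for the real Tag model.
metadata = sa.MetaData()
tags = sa.Table(
    "tags",
    metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("tag", sa.Text),
)

# string_agg() needs text input on PostgreSQL, hence the explicit cast;
# SQLite's group_concat() casts integers implicitly.
statement = (
    sa.select(tags.c.tag, sa.func.string_agg(sa.cast(tags.c.id, sa.Text), ",").label("ids"))
    .group_by(tags.c.tag)
)
```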
@@ -18,25 +18,31 @@

def upgrade() -> None:
with op.batch_alter_table("artifacts") as batch_op_artifacts:
batch_op_artifacts.alter_column( # type: ignore[attr-defined]
"visible", existing_type=sa.BOOLEAN(), nullable=True, existing_server_default=sa.text("'1'")
batch_op_artifacts.alter_column(
"visible",
existing_type=sa.BOOLEAN(),
nullable=True,
existing_server_default=sa.text("'1'"), # type: ignore[arg-type]
)
with op.batch_alter_table("versions") as batch_op_versions:
batch_op_versions.add_column(sa.Column("file", sa.Text(), nullable=True)) # type: ignore[attr-defined]
batch_op_versions.create_unique_constraint( # type: ignore[attr-defined]
batch_op_versions.add_column(sa.Column("file", sa.Text(), nullable=True))
batch_op_versions.create_unique_constraint(
"unique_version_artifact_id_name",
["artifact_id", "name"],
)


def downgrade() -> None:
with op.batch_alter_table("versions") as batch_op_versions:
batch_op_versions.drop_constraint( # type: ignore[attr-defined]
batch_op_versions.drop_constraint(
"unique_version_artifact_id_name",
type_="unique",
)
batch_op_versions.drop_column("versions", "file") # type: ignore[attr-defined]
batch_op_versions.drop_column("file")
with op.batch_alter_table("artifacts") as batch_op_artifacts:
batch_op_artifacts.alter_column( # type: ignore[attr-defined]
"artifacts", "visible", existing_type=sa.BOOLEAN(), nullable=False, existing_server_default=sa.text("'1'")
batch_op_artifacts.alter_column(
"visible",
existing_type=sa.BOOLEAN(),
nullable=False,
existing_server_default=sa.text("'1'"), # type: ignore[arg-type]
)
1 change: 1 addition & 0 deletions plugin_store/database/utils.py
@@ -17,6 +17,7 @@ class TZDateTime(TypeDecorator):
"""

impl = DateTime(timezone=True)
cache_ok = True

def process_bind_param(self, value: "datetime | None", dialect: "Dialect"):
if isinstance(value, datetime):
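The only change here is cache_ok = True. SQLAlchemy 1.4+ warns about TypeDecorator subclasses that do not declare cacheability, and statements using such types are excluded from the compiled statement cache. A minimal sketch of the declaration, mirroring utils.py, under the assumption that the type holds no per-instance state:

```python
from sqlalchemy import DateTime
from sqlalchemy.types import TypeDecorator


class TZDateTime(TypeDecorator):
    """Timezone-aware DateTime wrapper."""

    impl = DateTime(timezone=True)
    # Declares that instances of this type are safe to use in a cache key,
    # so statements using it can participate in SQLAlchemy's statement cache.
    cache_ok = True
```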