
Commit 4b87653

Merge branch 'master' into patch-3

minisbett authored Mar 1, 2024
2 parents d645220 + ae42bf6
Showing 78 changed files with 3,389 additions and 3,521 deletions.
3 changes: 3 additions & 0 deletions .env.example
@@ -54,6 +54,7 @@ PP_CACHED_ACCS=90,95,98,99,100
DISALLOWED_NAMES=mrekk,vaxei,btmc,cookiezi
DISALLOWED_PASSWORDS=password,abc123
DISALLOW_OLD_CLIENTS=True
DISALLOW_INGAME_REGISTRATION=True

DISCORD_AUDIT_LOG_WEBHOOK=

@@ -62,6 +63,8 @@ DISCORD_AUDIT_LOG_WEBHOOK=
# for debugging & development purposes.
AUTOMATICALLY_REPORT_PROBLEMS=False

LOG_WITH_COLORS=False

# XXX: Uncomment this if you have downloaded the database from maxmind.
# Change the path to the .mmdb file you downloaded, uncomment here and in docker-compose.yml
# You can download the database here: https://dev.maxmind.com/geoip/geolite2-free-geolocation-data
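The two new settings (`DISALLOW_INGAME_REGISTRATION`, `LOG_WITH_COLORS`) use the same `True`/`False` spelling as the existing flags. As a hypothetical illustration only (`env_bool` is not part of bancho.py, and the real settings module is not shown in this diff), such flags might be parsed like:

```python
import os

# Hypothetical helper; bancho.py's actual settings parser is not shown
# in this diff. It treats the usual truthy spellings as True.
def env_bool(key: str, default: str = "False") -> bool:
    return os.environ.get(key, default).strip().lower() in ("1", "true", "yes")

# Values as they would appear in .env
os.environ["DISALLOW_INGAME_REGISTRATION"] = "True"
os.environ["LOG_WITH_COLORS"] = "False"

print(env_bool("DISALLOW_INGAME_REGISTRATION"))  # True
print(env_bool("LOG_WITH_COLORS"))  # False
```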
10 changes: 9 additions & 1 deletion .github/docs/wiki/Setting-up.md
@@ -17,15 +17,23 @@ sudo apt install -y docker docker-compose
## configuring bancho.py

all configuration for the osu! server (bancho.py) itself can be done from the
-`.env` file. we provide an example `.env.example` file which you can use as a base.
+`.env` and `logging.yaml` files. we will provide example files for each, which
+you can use as a base and modify as you'd like.

```sh
# create a configuration file from the sample provided
cp .env.example .env

# create a logging configuration file from the sample provided
cp logging.yaml.example logging.yaml

# configure the application to your needs
# this is required to move onto the next steps
nano .env

# you can additionally configure the logging if you'd like,
# but the default should work fine for most users.
nano logging.yaml
```
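The contents of `logging.yaml.example` are not included in this diff. Assuming it follows Python's standard `dictConfig` schema (an assumption, not something this commit confirms), an equivalent minimal configuration expressed directly in Python would be:

```python
import logging
import logging.config

# Hypothetical minimal configuration, mirroring what a dictConfig-style
# logging.yaml might contain; the real logging.yaml.example shipped with
# bancho.py is not shown here.
LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "simple": {"format": "%(asctime)s %(levelname)s %(name)s: %(message)s"},
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "simple",
            "level": "INFO",
        },
    },
    "root": {"handlers": ["console"], "level": "INFO"},
}

logging.config.dictConfig(LOGGING_CONFIG)
logging.getLogger("bancho").info("logging configured")
```

a yaml file with the same keys would be loaded into exactly this kind of dict before being passed to `dictConfig`.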

## configuring a reverse proxy (we'll use nginx)
10 changes: 9 additions & 1 deletion .github/docs/wiki/locale/de-DE/Setting-up-de-DE.md
@@ -17,15 +17,23 @@ sudo apt install -y docker docker-compose
## configuring bancho.py

all configuration for the osu! server (bancho.py) itself can be done from the
-`.env` file. we provide an example `.env.example` file which you can use as a base.
+`.env` and `logging.yaml` files. we will provide example files for each, which
+you can use as a base and modify as you'd like.

```sh
# create a configuration file from the sample provided
cp .env.example .env

# create a logging configuration file from the sample provided
cp logging.yaml.example logging.yaml

# configure the application to your needs
# this is required to move onto the next steps
nano .env

# you can additionally configure the logging if you'd like,
# but the default should work fine for most users.
nano logging.yaml
```

## configuring a reverse proxy (we'll use nginx)
10 changes: 9 additions & 1 deletion .github/docs/wiki/locale/zh-CN/Setting-up-zh-CN.md
@@ -17,15 +17,23 @@ sudo apt install -y docker docker-compose
## configuring bancho.py

all configuration for the osu! server (bancho.py) itself can be done from the
-`.env` file. we provide an example `.env.example` file which you can use as a base.
+`.env` and `logging.yaml` files. we will provide example files for each, which
+you can use as a base and modify as you'd like.

```sh
# create a configuration file from the sample provided
cp .env.example .env

# create a logging configuration file from the sample provided
cp logging.yaml.example logging.yaml

# configure the application to your needs
# this is required to move onto the next steps
nano .env

# you can additionally configure the logging if you'd like,
# but the default should work fine for most users.
nano logging.yaml
```

## configuring a reverse proxy (we'll use nginx)
7 changes: 7 additions & 0 deletions .github/workflows/test.yaml
@@ -18,6 +18,7 @@ env:
APP_HOST: "0.0.0.0"
APP_PORT: "10000"
AUTOMATICALLY_REPORT_PROBLEMS: "False"
LOG_WITH_COLORS: "False"
COMMAND_PREFIX: "!"
DATA_DIRECTORY: "not relevant"
DB_HOST: "mysql"
@@ -65,3 +66,9 @@ jobs:
- name: Stop containers
if: always()
run: docker-compose down

- name: Archive code coverage results
uses: actions/upload-artifact@v2
with:
name: code-coverage-report
path: coverage/
3 changes: 3 additions & 0 deletions .gitignore
@@ -16,3 +16,6 @@ tools/cf_records.txt
/.db-data/
/.redis-data/
poetry.toml
.coverage
logging.yaml
logs.log
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -16,7 +16,7 @@ repos:
hooks:
- id: black
- repo: https://github.com/asottile/pyupgrade
-  rev: v3.15.0
+  rev: v3.15.1
hooks:
- id: pyupgrade
args: [--py311-plus, --keep-runtime-typing]
11 changes: 4 additions & 7 deletions Makefile
@@ -1,3 +1,5 @@
#!/usr/bin/env make

build:
if [ -d ".dbdata" ]; then sudo chmod -R 755 .dbdata; fi
docker build -t bancho:latest .
@@ -11,8 +13,9 @@ run-bg:
run-caddy:
caddy run --envfile .env --config ext/Caddyfile

last?=1
logs:
-	docker-compose logs -f bancho mysql redis
+	docker-compose logs -f bancho mysql redis --tail ${last}

shell:
poetry shell
@@ -21,12 +24,6 @@ test:
docker-compose -f docker-compose.test.yml up -d bancho-test mysql-test redis-test
docker-compose -f docker-compose.test.yml exec -T bancho-test /srv/root/scripts/run-tests.sh

-test-local:
-	poetry run pytest -vv tests/
-
-test-dbg:
-	poetry run pytest -vv --pdb -s tests/
-
lint:
poetry run pre-commit run --all-files

Empty file added app/adapters/__init__.py
Empty file.
164 changes: 164 additions & 0 deletions app/adapters/database.py
@@ -0,0 +1,164 @@
from __future__ import annotations

from typing import Any
from typing import cast

from databases import Database as _Database
from databases.core import Transaction
from sqlalchemy.dialects.mysql.mysqldb import MySQLDialect_mysqldb
from sqlalchemy.sql.compiler import Compiled
from sqlalchemy.sql.expression import ClauseElement

from app import settings
from app.logging import log
from app.timer import Timer


class MySQLDialect(MySQLDialect_mysqldb):
default_paramstyle = "named"


DIALECT = MySQLDialect()

MySQLRow = dict[str, Any]
MySQLParams = dict[str, Any] | None
MySQLQuery = ClauseElement | str


class Database:
def __init__(self, url: str) -> None:
self._database = _Database(url)

async def connect(self) -> None:
await self._database.connect()

async def disconnect(self) -> None:
await self._database.disconnect()

def _compile(self, clause_element: ClauseElement) -> tuple[str, MySQLParams]:
compiled: Compiled = clause_element.compile(
dialect=DIALECT,
compile_kwargs={"render_postcompile": True},
)
return str(compiled), compiled.params

async def fetch_one(
self,
query: MySQLQuery,
params: MySQLParams = None,
) -> MySQLRow | None:
if isinstance(query, ClauseElement):
query, params = self._compile(query)

with Timer() as timer:
row = await self._database.fetch_one(query, params)

if settings.DEBUG:
time_elapsed = timer.elapsed()
log(
f"Executed SQL query: {query} {params} in {time_elapsed * 1000:.2f} msec.",
extra={
"query": query,
"params": params,
"time_elapsed": time_elapsed,
},
)

return dict(row._mapping) if row is not None else None

async def fetch_all(
self,
query: MySQLQuery,
params: MySQLParams = None,
) -> list[MySQLRow]:
if isinstance(query, ClauseElement):
query, params = self._compile(query)

with Timer() as timer:
rows = await self._database.fetch_all(query, params)

if settings.DEBUG:
time_elapsed = timer.elapsed()
log(
f"Executed SQL query: {query} {params} in {time_elapsed * 1000:.2f} msec.",
extra={
"query": query,
"params": params,
"time_elapsed": time_elapsed,
},
)

return [dict(row._mapping) for row in rows]

async def fetch_val(
self,
query: MySQLQuery,
params: MySQLParams = None,
column: Any = 0,
) -> Any:
if isinstance(query, ClauseElement):
query, params = self._compile(query)

with Timer() as timer:
val = await self._database.fetch_val(query, params, column)

if settings.DEBUG:
time_elapsed = timer.elapsed()
log(
f"Executed SQL query: {query} {params} in {time_elapsed * 1000:.2f} msec.",
extra={
"query": query,
"params": params,
"time_elapsed": time_elapsed,
},
)

return val

async def execute(self, query: MySQLQuery, params: MySQLParams = None) -> int:
if isinstance(query, ClauseElement):
query, params = self._compile(query)

with Timer() as timer:
rec_id = await self._database.execute(query, params)

if settings.DEBUG:
time_elapsed = timer.elapsed()
log(
f"Executed SQL query: {query} {params} in {time_elapsed * 1000:.2f} msec.",
extra={
"query": query,
"params": params,
"time_elapsed": time_elapsed,
},
)

return cast(int, rec_id)

# NOTE: this accepts str since current execute_many uses are not using alchemy.
# alchemy does execute_many in a single query so this method will be unneeded once raw SQL is not in use.
async def execute_many(self, query: str, params: list[MySQLParams]) -> None:
if isinstance(query, ClauseElement):
query, _ = self._compile(query)

with Timer() as timer:
await self._database.execute_many(query, params)

if settings.DEBUG:
time_elapsed = timer.elapsed()
log(
f"Executed SQL query: {query} {params} in {time_elapsed * 1000:.2f} msec.",
extra={
"query": query,
"params": params,
"time_elapsed": time_elapsed,
},
)

def transaction(
self,
*,
force_rollback: bool = False,
**kwargs: Any,
) -> Transaction:
return self._database.transaction(force_rollback=force_rollback, **kwargs)
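Every query method above wraps its call in a `Timer` context manager imported from `app.timer`, which this diff does not include. A minimal stdlib sketch consistent with its usage here (`elapsed()` returning seconds as a float) might look like:

```python
import time


class Timer:
    # Minimal sketch consistent with how app.timer.Timer is used in
    # database.py; the real implementation is not part of this diff.
    def __enter__(self) -> "Timer":
        self._start = time.perf_counter()
        self._end: float | None = None
        return self

    def __exit__(self, *exc_info: object) -> None:
        self._end = time.perf_counter()

    def elapsed(self) -> float:
        # Seconds between __enter__ and __exit__.
        assert self._end is not None, "elapsed() called before the block exited"
        return self._end - self._start


with Timer() as timer:
    sum(range(100_000))

print(f"query took {timer.elapsed() * 1000:.2f} msec")
```

`elapsed()` returning seconds matches the `time_elapsed * 1000` millisecond conversion in the log calls above.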