fix: missing code written since august
gaudinnicolas committed Nov 20, 2024
1 parent ec00cdf commit e94fc36
Showing 15 changed files with 465 additions and 6 deletions.
65 changes: 65 additions & 0 deletions .d4g-tools/bin/.from-scratch-sh
@@ -0,0 +1,65 @@
#!/usr/bin/env bash
# shellcheck disable=SC1091
# shellcheck disable=SC1090
# shellcheck disable=SC2034

set -Eeuo pipefail

# IMPORTANT AND NECESSARY: Load dependencies
source "$LIB_DIR"/common.sh

usage() {
  cat <<EOF
USAGE: ${0} [-v] [-h]
This is a description of the script.
Honestly, write whatever you want.
Supported parameters:
  -h, --help    : display this message
  -v, --verbose : enable enhanced logging
EOF
  exit 1
}

parse_params() {
  if [ $# -gt 2 ]; then
    echo "Too many parameters provided"
    usage
  fi

  # Sane defaults
  DEBUG="false"
  RUN_DIR=$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd -P)

  while :; do
    case "${1-}" in
      -h | --help)
        usage
        ;;
      -v | --verbose)
        DEBUG="true"
        ;;
      --dummy-flag*)
        DUMMY_FLAG="true"
        ;;
      --dummy-param=*)
        DUMMY_PARAM="${1#*=}"
        ;;
      -?*)
        echo "Unknown option: $1"
        usage
        ;;
      *)
        break
        ;;
    esac
    shift
  done

  return 0
}

parse_params "$@"

echo -n "Ready to rumble."
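In the template above, the `--dummy-param=*` branch captures a value by stripping everything up to and including the first `=` with the `${1#*=}` expansion. A minimal sketch of that mechanic in isolation (the option name and value here are illustrative, not part of the script):

```sh
#!/usr/bin/env bash
set -Eeuo pipefail

# Simulate the positional parameters the option parser would see
set -- --dummy-param=hello

case "$1" in
  --dummy-param=*)
    # Strip the shortest prefix matching '*=' to keep only the value
    DUMMY_PARAM="${1#*=}"
    ;;
esac

echo "$DUMMY_PARAM"   # prints: hello
```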
1 change: 1 addition & 0 deletions .d4g-tools/bin/d4g.env
@@ -0,0 +1 @@
#TODO ?
69 changes: 69 additions & 0 deletions .d4g-tools/bin/docker.sh
@@ -0,0 +1,69 @@
#!/usr/bin/env bash
# shellcheck disable=SC1091
# shellcheck disable=SC1090
# shellcheck disable=SC2034

set -Eeuo pipefail

# IMPORTANT AND NECESSARY: Load dependencies
source "$LIB_DIR"/common.sh

usage() {
  cat <<EOF
USAGE: ${0} [-v] [-h]
This is a description of the script.
Honestly, write whatever you want.
Supported parameters:
  -h, --help    : display this message
  -v, --verbose : enable enhanced logging
EOF
  exit 1
}

parse_params() {
  if [ $# -gt 2 ]; then
    echo "Too many parameters provided"
    usage
  fi

  # Sane defaults
  DEBUG="false"
  RUN_DIR=$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd -P)

  while :; do
    case "${1-}" in
      -h | --help)
        usage
        ;;
      -v | --verbose)
        DEBUG="true"
        ;;
      --dummy-flag*)
        DUMMY_FLAG="true"
        ;;
      --dummy-param=*)
        DUMMY_PARAM="${1#*=}"
        ;;
      -?*)
        echo "Unknown option: $1"
        usage
        ;;
      *)
        break
        ;;
    esac
    shift
  done

  return 0
}

parse_params "$@"

if ! command_exists "docker"; then
  brew install docker
fi

echo -n "Ready to rumble."
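`command_exists` is provided by the shared `common.sh`, which is not part of this commit. A typical one-line implementation looks like the following (an assumption about the helper's behavior, not the actual code from `common.sh`):

```sh
#!/usr/bin/env bash

# Probe PATH without producing output; the exit status signals presence
command_exists() {
  command -v "$1" >/dev/null 2>&1
}

command_exists sh && echo "sh found"
command_exists not-a-real-binary-42 || echo "not found"
```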
54 changes: 54 additions & 0 deletions .d4g-tools/bin/env.sh
@@ -0,0 +1,54 @@
#!/usr/bin/env bash
# shellcheck disable=SC1091
# shellcheck disable=SC1090
# shellcheck disable=SC2034

set -Eeuo pipefail

# IMPORTANT AND NECESSARY: Load dependencies
source "$LIB_DIR"/common.sh

# This script replaces all env vars defined in .env and creates .env-full

# Specify the path to the .env file
env_file=".env"

# Specify the path to the intermediate .env-clean file
env_clean_file=".env-clean"

# Specify the path to the .env-full file
env_full_file=".env-full"

# Remove the existing .env-full file if it exists
if [ -f "$env_full_file" ]; then
  rm "$env_full_file"
fi

# Remove comments and blank lines from the original .env file and write to the new .env-clean file
sed '/^[[:blank:]]*#/d; /^[[:blank:]]*$/d' "$env_file" >"$env_clean_file"

#echo "New $env_clean_file file created without comments."

# Set the allexport option
set -o allexport

# Source the $env_clean_file file to load the environment variables
source "$env_clean_file"

# Unset the allexport option
set +o allexport

# Loop through the lines of the .env-clean file and write their values to the .env-full file
while IFS= read -r line; do
  var_name=$(echo "$line" | cut -d= -f1)
  var_value="${!var_name}"
  echo "$var_name=$var_value" >>"$env_full_file"
done <"$env_clean_file"

echo "--- Content of .env-full ---"
cat "$env_full_file"

# Remove the existing $env_clean_file file if it exists
if [ -f "$env_clean_file" ]; then
  rm "$env_clean_file"
fi
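The interesting part of `env.sh` is that sourcing under `set -o allexport` expands variables that reference earlier ones, so the generated file contains fully resolved values. A self-contained sketch of that behavior (file names here are illustrative, not the script's own):

```sh
#!/usr/bin/env bash
set -Eeuo pipefail

# One variable references another, as a real .env often does
printf 'HOST=db\nPORT=5432\nURL=postgres://$HOST:$PORT\n' > demo.env

set -o allexport
source demo.env   # $HOST and $PORT are expanded while sourcing
set +o allexport

# Mirror the .env-full loop: write the resolved value of each variable
while IFS= read -r line; do
  var_name=$(echo "$line" | cut -d= -f1)
  echo "$var_name=${!var_name}"
done < demo.env > demo.env-full

cat demo.env-full   # URL line comes out as postgres://db:5432
```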
29 changes: 29 additions & 0 deletions .d4g-tools/bin/fastapi/main.py
@@ -0,0 +1,29 @@
from typing import Iterator

from fastapi import FastAPI, Depends
from sqlmodel import Session, SQLModel, create_engine
import requests

DATABASE_URL = "postgresql://user:password@localhost/database"

engine = create_engine(DATABASE_URL)

# Dependency that yields a per-request database session
def get_db() -> Iterator[Session]:
    with Session(engine) as session:
        yield session

app = FastAPI()

@app.on_event("startup")
def on_startup():
    with Session(engine) as session:
        SQLModel.metadata.create_all(engine)
        # Fetch data from Gapminder API
        response = requests.get("https://api.gapminder.org/iso_codes")
        data = response.json()
        for item in data:
            country = Country(name=item["name"], iso_code=item["iso_code"], region_id=item["region_id"])
            session.add(country)
        session.commit()

# NOTE: Country (the SQLModel table) and BaseRoute (a generic CRUD router) are
# not defined or imported in this file; they are assumed to live in sibling
# modules of this commit.
country_route = BaseRoute(Country, get_db)
app.include_router(country_route.router, prefix="/countries", tags=["countries"])
160 changes: 160 additions & 0 deletions .d4g-tools/bin/fastapi/readme.md
@@ -0,0 +1,160 @@
A step-by-step guide to building the Flask web application.

### Step 1: Scrape Data and Store Locally

First, you'll need to scrape data from `data.gouv.fr` and store it locally as a CSV file.

```python
import io

import requests
import pandas as pd

# Make a GET request to the website
url = 'https://www.data.gouv.fr/path_to_csv_file'
r = requests.get(url)
r.raise_for_status()

# read_csv expects a path or file-like object, not raw bytes,
# so wrap the response body in a BytesIO buffer
df = pd.read_csv(io.BytesIO(r.content))

# Save the DataFrame to a CSV file
df.to_csv('./data/initial/data.csv', index=False)
```
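Loading a CSV payload from raw bytes can be checked offline, without the download (a minimal sketch with an inline stand-in for the response body):

```python
import io

import pandas as pd

# Stand-in for r.content returned by the request above
payload = b"name,value\na,1\nb,2\n"

# read_csv accepts a file-like object, so wrap the bytes in BytesIO
df = pd.read_csv(io.BytesIO(payload))

print(df.shape)  # prints: (2, 2)
```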

### Step 2: Create SQLModel and Persist to PostgreSQL

Next, create a SQLModel based on the CSV headers and persist it to PostgreSQL.

```python
from sqlmodel import SQLModel, Session, create_engine, Field
from typing import Optional
import pandas as pd

# Load the CSV file
df = pd.read_csv('./data/initial/data.csv')

# Define your SQLModel. SQLModel columns require type annotations, so they
# cannot be generated with exec() inside the class body; declare one field
# per CSV column explicitly (the column names below are illustrative).
class Data(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    name: Optional[str] = None
    value: Optional[float] = None

# Create an engine and a session
engine = create_engine("postgresql://user:password@db/dbname")
SQLModel.metadata.create_all(engine)
session = Session(engine)

# Persist the data to PostgreSQL
for index, row in df.iterrows():
    data = Data(**row.to_dict())
    session.add(data)

session.commit()
```

### Step 3: Build Flask Web App

Now, build a Flask web app to display the data using Flask-Table for enhanced table components.

```python
from flask import Flask, render_template
from flask_table import create_table, Col
from sqlmodel import select

# Build the table class from the CSV headers with Flask-Table's
# create_table helper instead of exec() inside a class body
DataTable = create_table('DataTable').add_column('id', Col('ID'))
for col in df.columns:
    DataTable.add_column(col, Col(col))

app = Flask(__name__)

@app.route('/')
def index():
    # df, session and Data are reused from the previous snippets
    result = session.exec(select(Data)).all()
    table = DataTable(result)
    return render_template('index.html', table=table)

if __name__ == '__main__':
    app.run(debug=True)
```
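The route renders `index.html`, which is not shown above. A minimal `templates/index.html` could look like this (a sketch; Flask-Table instances render themselves when interpolated in a Jinja2 template):

```html
<!-- templates/index.html -->
<!doctype html>
<html>
  <head><title>Data</title></head>
  <body>
    {{ table }}
  </body>
</html>
```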

### Step 4: Dockerize the Application

Create a `Dockerfile` for the Flask application:

```Dockerfile
FROM python:3.8-slim

WORKDIR /app

# netcat is used by entrypoint.sh and is not included in the slim image
RUN apt-get update \
    && apt-get install -y --no-install-recommends netcat-openbsd \
    && rm -rf /var/lib/apt/lists/*

COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt

COPY . .

ENTRYPOINT ["./entrypoint.sh"]
```

Create an `entrypoint.sh` script:

```sh
#!/bin/sh

# Wait for PostgreSQL to be ready
while ! nc -z db 5432; do
  sleep 1
done

# Run the application
exec "$@"
```

Make sure to give execute permission to `entrypoint.sh`:

```sh
chmod +x entrypoint.sh
```

Create a `docker-compose.yml` file:

```yaml
version: '3'
services:
  web:
    build: .
    # Flask is a WSGI app, so plain gunicorn workers are used
    # (uvicorn workers are for ASGI frameworks such as FastAPI)
    command: gunicorn -w 4 main:app --bind 0.0.0.0:5000
    ports:
      - "5000:5000"
    depends_on:
      - db
  db:
    image: "postgres"
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: dbname
```
Create a `requirements.txt` file:

```
Flask
Flask-Table
pandas
requests
sqlmodel
psycopg2-binary
gunicorn
uvicorn
```
### Step 5: Build and Run the Application

Build and run the Docker containers:
```sh
docker compose up --build
```

This will build the Docker images and start the containers. You can access the Flask web app at `http://localhost:5000`.

Feel free to adjust the code to fit your specific needs and ensure that all dependencies are installed correctly.
2 changes: 1 addition & 1 deletion .d4g-tools/bin/python/pyproject.toml.dist
@@ -4,7 +4,7 @@ build-backend = "poetry.core.masonry.api"

[tool.poetry]
name = "$PROJECT_NAME"
version = "0.1.0"
version = "$PROJECT_VERSION"
description = "$PROJECT_DESCRIPTION"
authors = ["DataForGood", "$PROJECT_AUTHORS"]
license = " MIT"
