This demo showcases a Flask-based AI assistant that connects to a local PostgreSQL running in Docker via a fastmcp MCP server over SSE. The assistant can list tables, generate SQL from natural language, execute it, and return results.
- Docker Compose: PostgreSQL
- `scripts/setup.sh`: initializes the DB with sample schema/data (via `scripts/sample.sql`)
- `postgres-mcp-fastmcp.py`: fastmcp server exposing `list_tables` and `run_query` over SSE
- `app.py`: Flask UI + OpenAI-powered SQL generation calling the MCP server
- Docker Desktop
- Python 3.10+
- An OpenAI API key
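Before starting, you can sanity-check the prerequisites from the shell (a small sketch; the Docker check only warns, it does not fail):

```shell
# Verify Python 3.10+ is available (the demo's minimum version)
python3 -c 'import sys; assert sys.version_info >= (3, 10), sys.version'
# Warn (without failing) if Docker is not on PATH
command -v docker >/dev/null 2>&1 || echo "Docker not found on PATH"
```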
- Copy env file and edit values:

```bash
cp .env.example .env
chmod +x scripts/*.sh
```

- Start Postgres and seed sample data (uses docker-compose):

```bash
bash scripts/setup.sh
```

- Create venv and install deps:

```bash
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

- Run the MCP server (separate terminal):

```bash
python postgres-mcp-fastmcp.py
```

- Run the Flask app:

```bash
python app.py
```

Visit http://localhost:5050.
Optional check via the fastmcp client:
```bash
python - <<'PY'
from fastmcp import Client
import asyncio

async def main():
    async with Client('http://127.0.0.1:8000/sse') as c:
        tools = await c.list_tools()
        print('tools:', [t.name for t in tools])
        r = await c.call_tool('list_tables', _return_raw_result=True)
        print('list_tables raw:', r)
        r2 = await c.call_tool('run_query', {'sql': 'select * from customers limit 1'}, _return_raw_result=True)
        print('run_query raw:', r2)

asyncio.run(main())
PY
```

See .env.example. Sensitive values are loaded from .env.
The MCP server respects `MCP_HOST` and `MCP_PORT` (defaults 127.0.0.1:8000), and the Flask app uses them to call the server. Flask serves on `FLASK_PORT` (default 5050).
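The host/port resolution can be sketched as follows (a minimal sketch using the documented defaults; the actual code in `app.py` may differ):

```python
import os

# Defaults mirror the documented ones: 127.0.0.1:8000 for MCP, 5050 for Flask
MCP_HOST = os.getenv("MCP_HOST", "127.0.0.1")
MCP_PORT = int(os.getenv("MCP_PORT", "8000"))
FLASK_PORT = int(os.getenv("FLASK_PORT", "5050"))

# The Flask app reaches the MCP server's SSE endpoint at this URL
MCP_SSE_URL = f"http://{MCP_HOST}:{MCP_PORT}/sse"
print(MCP_SSE_URL)
```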
For the OpenAI-compatible LLM, the app reads:
- `OPENAI_BASE_URL` (default: http://192.168.1.39:8000/v1)
- `OPENAI_API_KEY` (default: redhat123)
- `OPENAI_MODEL` (default: gpt-4o-mini)
Override these in your .env if your local server uses different values.
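A complete .env might look like this (values are the documented defaults; replace them for your setup):

```
# MCP server / Flask
MCP_HOST=127.0.0.1
MCP_PORT=8000
FLASK_PORT=5050

# OpenAI-compatible LLM
OPENAI_BASE_URL=http://192.168.1.39:8000/v1
OPENAI_API_KEY=redhat123
OPENAI_MODEL=gpt-4o-mini
```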
- This demo is for local exploration only. Do not expose it publicly.
- Error handling is minimal for clarity. Improve before production use.
Note: The fastmcp server speaks SSE on `/sse`. Traditional JSON-RPC curl calls to `/` are not supported.
Here are solid demo prompts that exercise joins, grouping, filters, and window functions without being destructive:
- "Total paid order amount per customer, show customer name and total, sorted desc."
- "Average order amount by status (paid, pending, refunded), highest first."
- "List customers with no orders."
- "Top 5 customers by total paid amount in the last 30 days."
- "Daily paid revenue for the last 7 days."
- "Pending orders older than 7 days (id, customer_id, amount, created_at)."
- "Refund amount and refund rate per customer (refunded/total)."
- "Email domain breakdown of customers with counts (e.g., example.com)."
- "First order date per customer alongside customer name."
- "Running total of paid amount per customer over time (customer, date, running_total)."
- "Customers with lifetime paid spend greater than 200, sorted by spend."
- "Top 5 customers by average paid order value with at least 2 paid orders."
- "Monthly paid revenue for the current year (YYYY-MM and total)."
- "Paid vs refunded totals per customer in separate columns."
- "Show the column names and data types for public.customers and public.orders."
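As an illustration of the kind of SQL the assistant might generate, here is the running-total prompt answered with a window function, run against SQLite as a lightweight stand-in for Postgres. The schema and rows below are assumptions inferred from the prompts, not taken from `scripts/sample.sql`:

```python
import sqlite3

# In-memory stand-in for the demo database (assumed schema)
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                     amount REAL, status TEXT, created_at TEXT);
INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
INSERT INTO orders (customer_id, amount, status, created_at) VALUES
  (1, 100, 'paid',     '2024-01-01'),
  (2, 75,  'paid',     '2024-01-02'),
  (1, 50,  'paid',     '2024-01-03'),
  (1, 20,  'refunded', '2024-01-04');
""")

# "Running total of paid amount per customer over time"
sql = """
SELECT c.name, o.created_at AS date,
       SUM(o.amount) OVER (PARTITION BY o.customer_id
                           ORDER BY o.created_at) AS running_total
FROM orders o JOIN customers c ON c.id = o.customer_id
WHERE o.status = 'paid'
ORDER BY c.name, o.created_at;
"""
for row in conn.execute(sql):
    print(row)
```

The same query runs unchanged on Postgres, since window functions are standard SQL.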