Analyze GitHub repositories and generate insights using AI.
- Setup Environment

```bash
# Create a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```
- Start Services (in this exact order)

```bash
# 1. Start infrastructure services first (databases, Elasticsearch, etc.)
docker-compose up

# 2. Start the backend API server
uvicorn backend.api.app:app --reload --log-level=info

# 3. Start the frontend dev server
npm run dev
```
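Because the services must come up in this exact order, it can help to wait for one service to become reachable before starting the next. A minimal sketch of such a polling helper (the helper name, timeout, and probe are illustrative, not part of the project):

```python
import time

def wait_for(probe, timeout: float = 30.0, interval: float = 0.5) -> bool:
    """Poll `probe` (a zero-argument callable returning bool) until it
    returns True or `timeout` seconds elapse. Returns True on success."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval)
    return False
```

A probe could, for example, try opening a TCP connection to port 8000 before launching `npm run dev`.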
- Initialize Repository

```bash
curl -X POST http://localhost:8000/repos/init \
  -H "Content-Type: application/json" \
  -d '{"owner": "openai", "repo": "tiktoken"}'

curl -X POST http://localhost:8000/repos/init \
  -H "Content-Type: application/json" \
  -d '{"owner": "Tialo", "repo": "githubXplainer"}'
```
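The same call can be made from Python instead of curl. A minimal sketch using only the standard library; the base URL and request body come from the curl examples above, while the response shape is an assumption:

```python
import json
import urllib.request

API = "http://localhost:8000"  # default host/port from the examples above

def repo_init_payload(owner: str, repo: str) -> bytes:
    """JSON body for POST /repos/init, matching the curl examples."""
    return json.dumps({"owner": owner, "repo": repo}).encode()

def init_repo(owner: str, repo: str) -> dict:
    """POST /repos/init; parsing the response as JSON is an assumption."""
    req = urllib.request.Request(
        f"{API}/repos/init",
        data=repo_init_payload(owner, repo),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```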
- Manage Elasticsearch Index

```bash
# Initialize the index
curl -X POST http://localhost:8000/elasticsearch/init

# Drop the index
curl -X POST http://localhost:8000/elasticsearch/clear
```
- Search with FAISS

```bash
curl -X POST http://localhost:8000/search/faiss \
  -H "Content-Type: application/json" \
  -d '{"query": "which commits simplified code?", "owner": "openai", "name": "tiktoken"}'
```
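One easy-to-miss detail in the example above: the search body uses the key `name` for the repository, while `/repos/init` uses `repo`. A small payload builder makes that explicit (the function name is illustrative; the field names are taken from the curl examples):

```python
import json

def faiss_search_payload(query: str, owner: str, name: str) -> bytes:
    """JSON body for POST /search/faiss. Note the repository field is
    `name` here, unlike the `repo` field used by /repos/init."""
    return json.dumps({"query": query, "owner": owner, "name": name}).encode()
```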
- Clean Up

```bash
# Delete a repository
curl -X DELETE http://localhost:8000/repos/delete \
  -H "Content-Type: application/json" \
  -d '{"owner": "Tialo", "repo": "githubXplainer"}'

# Drop volumes
docker-compose down -v
```
- Development

```bash
# Format code
poetry run black backend/
poetry run isort backend/

# Run tests
poetry run pytest
```
- Troubleshooting
  - Database issues: check `docker-compose ps` output and database credentials
  - GitHub API: verify the token in `.env` and check rate limits
  - Service errors: check `logs/app.log` for details
Browse the OpenAPI docs at http://localhost:8000/docs.