This repository contains two Python projects demonstrating AI observability with LaunchDarkly:

- AI Chat: A simple CLI chat interface using OpenAI
- LangChain Tools: A CLI tool with multiple AI capabilities using LangChain

## Prerequisites

- Python 3.10 or higher
- Poetry for dependency management
- OpenAI API key
- LaunchDarkly SDK key (optional)
## AI Chat

A straightforward CLI chat interface built on OpenAI's GPT models.

### Setup

```bash
# Navigate to the project
cd ai_chat

# Install dependencies
poetry install

# Set up environment variables
cp .env.example .env
# Edit .env with your API keys
```

### Running

```bash
# Development mode with hot reload
poetry run watch

# Regular mode
poetry run dev

# CLI options
poetry run dev --help
```
Available commands in chat:

- Type your message and press Enter to send it
- `clear`: Clear conversation history
- `help`: Show help message
- `exit`, `quit`, or `bye`: End session
## LangChain Tools

A more advanced CLI tool with multiple AI capabilities built on LangChain.

### Setup

```bash
# Navigate to the project
cd langchain_tools

# Install dependencies
poetry install

# Set up environment variables
cp .env.example .env
# Edit .env with your API keys
```
### Environment Variables

```bash
# Required
OPENAI_API_KEY=your_openai_api_key

# Optional
LAUNCHDARKLY_SDK_KEY=your_launchdarkly_sdk_key
OTEL_SERVICE_NAME=langchain-tools
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318/v1/traces
OTEL_DEBUG=true  # For development
```
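At startup, these variables can be read with sensible defaults for the optional ones. The sketch below uses only `os.environ`; the real projects may load `.env` files via a helper such as python-dotenv, and the function name is an assumption.

```python
# Hypothetical sketch: reading the environment variables above at startup.
# Only OPENAI_API_KEY is required; the others fall back to defaults.
import os

def load_settings() -> dict:
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is required")
    return {
        "openai_api_key": api_key,
        # Optional: LaunchDarkly features stay disabled when unset
        "launchdarkly_sdk_key": os.environ.get("LAUNCHDARKLY_SDK_KEY"),
        "otel_service_name": os.environ.get("OTEL_SERVICE_NAME", "langchain-tools"),
        "otel_endpoint": os.environ.get(
            "OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4318/v1/traces"
        ),
        # OTEL_DEBUG is a string in the environment; normalize to a bool
        "otel_debug": os.environ.get("OTEL_DEBUG", "false").lower() == "true",
    }
```

Failing fast on the one required variable gives a clearer error than a later authentication failure inside the OpenAI client.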
### Running

```bash
# Development mode with hot reload
poetry run watch

# Regular mode
poetry run dev

# CLI options
poetry run dev --help
```
Available commands:

- Type your question and press Enter to send it
- `help`: Show help message
- `exit`, `quit`, or `bye`: End session
## LaunchDarkly Integration

Both projects support LaunchDarkly for:

- Dynamic AI configuration
- Observability and metrics
- A/B testing of prompts and models

To enable LaunchDarkly features:

1. Set `LAUNCHDARKLY_SDK_KEY` in your `.env` file
2. Create the corresponding flags in your LaunchDarkly project:
   - For AI Chat: `ai-observability-python-chat`
   - For LangChain Tools: `langchain-agent-config`
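A flag lookup with the LaunchDarkly Python SDK (`ldclient`) might look like the sketch below. The default payload (`model`, `temperature`) and the user key are assumptions, not the projects' actual values; when no SDK key is set, the function falls back to static defaults so the CLI still works without LaunchDarkly.

```python
# Hypothetical sketch: fetch an agent configuration from LaunchDarkly,
# falling back to static defaults when LAUNCHDARKLY_SDK_KEY is unset.
import os

DEFAULT_CONFIG = {"model": "gpt-4o-mini", "temperature": 0.7}  # assumed defaults

def get_agent_config() -> dict:
    sdk_key = os.environ.get("LAUNCHDARKLY_SDK_KEY")
    if not sdk_key:
        return DEFAULT_CONFIG  # LaunchDarkly disabled: use local defaults

    # Imported lazily so the SDK is only required when actually enabled
    import ldclient
    from ldclient import Context
    from ldclient.config import Config

    ldclient.set_config(Config(sdk_key))
    client = ldclient.get()
    context = Context.builder("cli-user").build()  # assumed context key
    # The third argument is the fallback if the flag is missing or unreachable
    return client.variation("langchain-agent-config", context, DEFAULT_CONFIG)
```

Passing `DEFAULT_CONFIG` as the `variation` fallback means a misconfigured flag degrades gracefully rather than crashing the CLI.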
## OpenTelemetry Tracing

Both projects include OpenTelemetry instrumentation:

- AI Chat: Traces OpenAI API calls
- LangChain Tools: Traces both OpenAI and LangChain operations

To view traces:

1. Set up an OpenTelemetry collector
2. Configure `OTEL_EXPORTER_OTLP_ENDPOINT`
3. Enable debug mode with `OTEL_DEBUG=true` for console output
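For step 1, a minimal OpenTelemetry Collector configuration that accepts OTLP over HTTP on port 4318 (matching the endpoint above) and prints received spans could look like this. This is a sketch; exporter names vary across collector versions (older releases use `logging` instead of `debug`).

```yaml
# Minimal collector config sketch: receive OTLP/HTTP on :4318, print spans.
receivers:
  otlp:
    protocols:
      http:
        endpoint: 0.0.0.0:4318

processors:
  batch:

exporters:
  debug:

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug]
```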
## Hot Reload

Both projects support hot reload during development:

```bash
poetry run watch  # In either project directory
```

This automatically restarts the application when code changes are detected.
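The `dev` and `watch` commands work because each project declares them as Poetry script entry points. A hypothetical `pyproject.toml` excerpt is shown below; the module paths are assumptions, not the projects' actual layout.

```toml
# Hypothetical pyproject.toml excerpt: how `poetry run dev` and
# `poetry run watch` resolve to Python functions.
[tool.poetry.scripts]
dev = "ai_chat.main:main"
watch = "ai_chat.watch:main"
```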