We've built a simple console application that demonstrates how LaunchDarkly's SDK works.
Below, you'll find the build procedure. For more comprehensive instructions, you can visit your Quickstart page or the Python reference guide.
This demo requires Python 3.8 or higher.
This repository includes examples for OpenAI, Bedrock, and LangChain for multi-provider support. Depending on your preferred provider, you may have to take some additional steps.
- Set the environment variable `LAUNCHDARKLY_SDK_KEY` to your LaunchDarkly SDK key. If there is an existing AI Config in your LaunchDarkly project that you want to evaluate, set `LAUNCHDARKLY_AI_CONFIG_KEY` to the flag key; otherwise, an AI Config of `sample-ai-config` will be assumed.

  ```bash
  export LAUNCHDARKLY_SDK_KEY="1234567890abcdef"
  export LAUNCHDARKLY_AI_CONFIG_KEY="sample-ai-config"
  ```
- Ensure you have Poetry installed (common installation options are shown below).
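If Poetry is not yet available on your machine, either of the following general-purpose installation methods works; these are standard Poetry installation commands, not steps specific to this repository:

```bash
# Install Poetry with pipx (one supported method)
pipx install poetry

# Or use the official installer script
curl -sSL https://install.python-poetry.org | python3 -
```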
- For the OpenAI example, install the required dependencies with `poetry install -E openai` or `poetry install --all-extras`.
- Set the environment variable `OPENAI_API_KEY` to your OpenAI API key.
- On the command line, run `poetry run openai-example`, as shown in the sketch below.
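Putting those steps together, an end-to-end run of the OpenAI example might look like the following; the SDK key and API key values are placeholders:

```bash
# Install the OpenAI extra for this project
poetry install -E openai

# Configure LaunchDarkly and OpenAI credentials (placeholder values)
export LAUNCHDARKLY_SDK_KEY="1234567890abcdef"
export LAUNCHDARKLY_AI_CONFIG_KEY="sample-ai-config"
export OPENAI_API_KEY="sk-..."

# Run the example script defined in the project
poetry run openai-example
```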
- For the Bedrock example, install the required dependencies with `poetry install -E bedrock` or `poetry install --all-extras`.
- Ensure the required AWS credentials can be auto-detected by the `boto3` library, for example via environment variables, role providers, or shared credential files.
- On the command line, run `poetry run bedrock-example`, as shown in the sketch below.
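As one option among those credential sources, the standard AWS environment variables that `boto3` recognizes can be exported before running the example; the values (including the region) are placeholders:

```bash
# Install the Bedrock extra for this project
poetry install -E bedrock

# One way to supply credentials: the standard AWS environment variables
# (role providers or a shared credentials file work as well)
export AWS_ACCESS_KEY_ID="AKIA..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_DEFAULT_REGION="us-east-1"

# Run the example script defined in the project
poetry run bedrock-example
```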
This example uses the OpenAI, Bedrock, and Gemini LangChain provider packages. You can add additional LangChain providers using the `poetry add` command.
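For instance, extending the example to another provider would start by adding that provider's LangChain integration package; the command below uses the upstream `langchain-anthropic` distribution purely as an illustration, and wiring the new provider into the example's configuration is a separate step:

```bash
# Example only: pull in an additional LangChain provider integration
poetry add langchain-anthropic
```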
- For the LangChain example, install all dependencies with `poetry install -E langchain` or `poetry install --all-extras`.
- Set up API keys for the providers you want to use.
- On the command line, run `poetry run langchain-example`, as shown in the sketch below.
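Putting the LangChain steps together, a run that exercises the OpenAI and Gemini providers might look like the following; `GOOGLE_API_KEY` is the variable the Gemini LangChain integration conventionally reads, the key values are placeholders, and Bedrock would additionally need AWS credentials as described above:

```bash
# Install the LangChain extra (or everything) for this project
poetry install -E langchain

# LaunchDarkly configuration (placeholder value)
export LAUNCHDARKLY_SDK_KEY="1234567890abcdef"

# Provider API keys for the providers you plan to exercise (placeholders)
export OPENAI_API_KEY="sk-..."
export GOOGLE_API_KEY="..."

# Run the example script defined in the project
poetry run langchain-example
```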