From cc7b9c63db464a9b94f901c172d15c8a3bbfd92f Mon Sep 17 00:00:00 2001
From: Massimiliano Pippi
Date: Fri, 12 Apr 2024 19:18:57 +0200
Subject: [PATCH] mention LiteLLM

---
 README.md | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/README.md b/README.md
index 4c758f7..92d7139 100644
--- a/README.md
+++ b/README.md
@@ -189,6 +189,9 @@ If you paste Banks' output into ChatGPT you would get something like this:
 Climate change is a pressing global issue, but together we can create positive change! Let's embrace renewable energy, protect our planet, and build a sustainable future for generations to come. 🌍💚 #ClimateAction #PositiveFuture
 ```
 
+The `generate` extension uses [LiteLLM](https://github.com/BerriAI/litellm) under the hood, and provided you have the
+proper environment variables set, you can use any model from the supported [model providers](https://docs.litellm.ai/docs/providers).
+
 ### Go meta: create a prompt and `generate` its response
 
 We can leverage Jinja's macro system to generate a prompt, send the result to OpenAI and get a response.
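
For illustration, a minimal sketch of the environment-variable mechanism the new paragraph relies on, calling LiteLLM directly rather than going through the `generate` extension. The model name, prompt, and API key below are placeholders, and the direct `completion` call only demonstrates how LiteLLM picks up provider credentials from the environment; it is not Banks' own call path.

```python
import os
from litellm import completion

# Placeholder credential: LiteLLM reads the provider key from the environment,
# which is what "proper environment variables" means in the paragraph above.
os.environ["OPENAI_API_KEY"] = "sk-..."  # set your real key instead

# Direct LiteLLM call, shown only to illustrate the mechanism; the `generate`
# extension relies on the same LiteLLM machinery under the hood.
response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Write a tweet about climate change."}],
)
print(response.choices[0].message.content)
```

Switching providers is then a matter of changing the model string and exporting the matching credential (for example `ANTHROPIC_API_KEY`), as listed in LiteLLM's provider documentation.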