[Feature]: "local/" model prefix to use LOCAL_API_URL and LOCAL_API_KEY env variables #2352
Replies: 2 comments
-
hey @jakobdylanc, can you explain the difference between this and how it works today?
-
Hey @krrishdholakia, I'm confused by what you're showing me here. Can you elaborate? In general, a big convenience LiteLLM provides is the ability to automatically pull in certain environment variables depending on the model name prefix. For example, "openai/" models automatically pick up OPENAI_API_KEY. What I'm proposing is an extension of this functionality to make using local API servers much easier, by automatically pulling in LOCAL_API_URL and LOCAL_API_KEY when you use the "local/" prefix.
-
The Feature
Currently, users have to manually override "base_url" in their code when using a local API server. LiteLLM should support "local/" at the beginning of the model name, which would pull in the LOCAL_API_URL and LOCAL_API_KEY environment variables.
For example, users of oobabooga/text-generation-webui could set the model name to "local/openai/model" which indicates a locally running OpenAI compatible API server. Then they'd just have to set the following environment variables:
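For example (the URL below is just an illustrative placeholder; use whatever address your local server actually listens on):

```shell
# Hypothetical example values, not defaults of any particular server.
export LOCAL_API_URL="http://localhost:5000/v1"
export LOCAL_API_KEY="sk-anything"   # optional, see note below
```
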
(LOCAL_API_KEY should be optional since local API servers usually don't need an API key. But it definitely has use cases.)
Edge case note: Some local API servers (like LM Studio) will throw an error if "api_key" is a blank string. So LiteLLM should account for this, e.g. by setting "api_key" to "Not used" if the user doesn't provide a valid LOCAL_API_KEY.
Motivation, pitch
This is the simplest and most general solution for all local API server use cases.