LiteLLM Proxy Configuration File Generator #3319
CXwudi started this conversation in Show and tell
- Nice! How do you use this with LiteLLM, @CXwudi? A sense of the expected user flow would be great!
Hi all,
First of all, many thanks to all the LiteLLM maintainers. LiteLLM has been amazing and has let me access 100+ models through the same OpenAI API format.
However, hand-writing the configuration file for all 100+ models is unrealistic, so I developed a small program to generate it automatically. It is now open-sourced at https://github.com/CXwudi/litellm-config-generator
The program only requires some configuration and an incomplete LiteLLM proxy configuration file (one with the `model_list` section removed). When run, it reads the latest list of models from each provider and outputs the completed LiteLLM proxy configuration file with the `model_list` filled in. You can find more details in the README file.
So far the program supports OpenAI, Gemini, Anthropic, Mistral, Groq, GitHub Copilot, TogetherAI, and OpenRouter.
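For context, this is roughly what the generated `model_list` section looks like. It is a minimal sketch based on LiteLLM's documented proxy config format; the specific model names and aliases here are illustrative, not output from the generator:

```yaml
# Sketch of a filled-in model_list; each entry maps a client-facing alias
# to a provider model. Model names below are illustrative.
model_list:
  - model_name: gpt-4o                       # alias clients request via the proxy
    litellm_params:
      model: openai/gpt-4o                   # provider/model identifier
      api_key: os.environ/OPENAI_API_KEY     # read the key from the environment
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
```

Writing entries like these by hand for 100+ models is exactly the tedium the generator removes: it queries each provider's model-listing endpoint and emits one entry per discovered model.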