Is it possible to separate Models by Team? #5563
Hey, I'm trying LiteLLM out, and so far it's brilliant! We're currently using the Proxy Server, and we're able to create Virtual Keys per team. I'm asking because I have multiple teams, and each will have its own OpenAI API key, so with the current approach I would need a separate model entry for every team based on, for example, gpt-4. It would be awesome if there were a way to separate Models by team the same way we separate Virtual Keys.
Replies: 1 comment
Yes - use team-based routing: https://docs.litellm.ai/docs/proxy/team_based_routing
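
A minimal sketch of what that can look like, assuming the proxy runs at `http://0.0.0.0:4000` with master key `sk-1234`, and that `config.yaml` already defines one deployment per team (e.g. `gpt-4-team-a` with that team's own OpenAI API key). The endpoint and field names (`/team/new`, `model_aliases`, `/key/generate`) follow the linked doc, but double-check them against the current docs:

```python
# Hedged sketch: per-team model routing on a LiteLLM proxy.
# Assumes deployments like "gpt-4-team-a" are already defined in config.yaml,
# each pointing at that team's own OpenAI API key.
import requests
import openai

PROXY_URL = "http://0.0.0.0:4000"
MASTER_KEY = "sk-1234"  # illustrative master key
headers = {"Authorization": f"Bearer {MASTER_KEY}"}

# 1. Create a team whose calls to "gpt-4" are routed to its own deployment.
team = requests.post(
    f"{PROXY_URL}/team/new",
    headers=headers,
    json={
        "team_alias": "team-a",
        "model_aliases": {"gpt-4": "gpt-4-team-a"},  # alias -> team deployment
    },
).json()

# 2. Generate a virtual key scoped to that team.
key = requests.post(
    f"{PROXY_URL}/key/generate",
    headers=headers,
    json={"team_id": team["team_id"]},
).json()

# 3. Team members call the shared alias "gpt-4"; the proxy resolves it to
#    "gpt-4-team-a" and uses that deployment's OpenAI key.
client = openai.OpenAI(api_key=key["key"], base_url=PROXY_URL)
resp = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "hello"}],
)
print(resp.choices[0].message.content)
```

The upside of this setup is that every team keeps calling the same public model name ("gpt-4") while the proxy maps it to the team-specific deployment behind the scenes.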