How to call a Hugging Face TGI endpoint using LiteLLM? #2214
Replies: 3 comments
-
I tried the solution given in https://docs.litellm.ai/docs/providers/custom but I am getting the error below:
-
I could make it work. Here is the output:
-
Use the Router for load balancing - https://docs.litellm.ai/docs/routing
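Following the Router docs linked above, here is a minimal sketch of load balancing across three TGI endpoints. The host URLs, model id, and the `tgi-llm` alias are placeholder assumptions, not values from this thread:

```python
# Sketch: litellm.Router load balancing over 3 TGI deployments.
# All three entries share one model_name alias, so the Router picks
# among them for each request. URLs and model id are placeholders.
model_list = [
    {
        "model_name": "tgi-llm",  # alias that clients request
        "litellm_params": {
            "model": "huggingface/my-org/my-model",  # placeholder model id
            "api_base": f"http://tgi-host-{i}:8080",  # placeholder TGI server
        },
    }
    for i in (1, 2, 3)
]

def make_router():
    # Imported here so the config above can be inspected without litellm installed.
    from litellm import Router  # requires: pip install litellm
    return Router(model_list=model_list)

if __name__ == "__main__":
    router = make_router()
    # Each call is dispatched to one of the three TGI endpoints.
    response = router.completion(
        model="tgi-llm",
        messages=[{"role": "user", "content": "Hello from the router"}],
    )
    print(response.choices[0].message.content)
```

Because the three deployments share one `model_name`, callers only ever reference the alias; the Router decides which `api_base` actually serves each request.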
-
Hi,
I have been using Hugging Face TGI endpoints as below. Now I am thinking of using LiteLLM for load balancing between 3 TGI endpoints.
Is there any example available showing how to use TGI endpoints with LiteLLM for the generate callback of the TGI server endpoints?
Is there any example available showing how to use TGI endpoints with LiteLLM for load balancing?
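For reference, a minimal sketch of calling a single TGI endpoint through LiteLLM's Hugging Face provider; the endpoint URL and model name below are placeholder assumptions, not values from this thread:

```python
# Sketch: one TGI endpoint via LiteLLM's huggingface provider.
# Passing api_base points the request at a self-hosted TGI server
# instead of the Hugging Face Inference API.
messages = [{"role": "user", "content": "Say hello."}]

def call_tgi(api_base: str):
    # Imported here so the snippet can be read/tested without litellm installed.
    from litellm import completion  # requires: pip install litellm
    return completion(
        model="huggingface/my-org/my-model",  # placeholder model id
        messages=messages,
        api_base=api_base,  # e.g. "http://localhost:8080" for a local TGI server
    )

if __name__ == "__main__":
    # Requires a TGI server actually listening at this address.
    response = call_tgi("http://localhost:8080")
    print(response.choices[0].message.content)
```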