Can LiteLLM work with model serving with Triton Inference Server? #3498
Unanswered
YunchaoYang asked this question in Q&A
Hi all,
Just wondering: is there a way for LiteLLM to run inference against Triton Inference Server using the KServe protocol?

Replies: 2 comments, 2 replies
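For context, the KServe v2 inference protocol that Triton serves over HTTP looks roughly like the sketch below; the endpoint URL, model name, and tensor name are placeholders, not details from this thread:

```python
import requests

# Placeholder Triton endpoint and model name -- replace with your deployment.
TRITON_URL = "http://localhost:8000"
MODEL_NAME = "my_model"

# KServe v2 inference protocol: POST /v2/models/{model}/infer
payload = {
    "inputs": [
        {
            "name": "input__0",   # input tensor name from your model config
            "shape": [1, 3],      # batch of one, three features
            "datatype": "FP32",
            "data": [1.0, 2.0, 3.0],
        }
    ]
}

resp = requests.post(f"{TRITON_URL}/v2/models/{MODEL_NAME}/infer", json=payload)
resp.raise_for_status()
print(resp.json()["outputs"])
```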
-
Hey @YunchaoYang, I think I've seen this request before - do you have docs for the API? And is there an easy way to test it?
-
@YunchaoYang we support Triton Embeddings: https://docs.litellm.ai/docs/providers/triton-inference-server. Is this what you needed?
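For anyone landing here, a minimal sketch of what that docs page describes; the model name and api_base below are placeholders for your own Triton deployment, so check the linked page for the exact parameters:

```python
import litellm

# The "triton/" prefix routes the request to LiteLLM's Triton provider;
# api_base should point at your Triton server's embeddings endpoint.
response = litellm.embedding(
    model="triton/my-embedding-model",                    # placeholder model name
    api_base="http://localhost:8000/triton/embeddings",   # placeholder endpoint
    input=["good morning from litellm"],
)
print(response)
```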