Parameters in swagger docs #4734
-
Hi all! My team is researching running a LiteLLM server and is working on some internal documentation for it. We've noticed that when FastAPI auto-generates the OpenAPI spec for the LiteLLM-added endpoints (via inclusion of the router from litellm/proxy/proxy_server.py), only some of the endpoints' parameters actually show up, e.g. only the model parameter for /chat/completions (screenshot of the Swagger UI omitted).

Is there a full OpenAPI spec available for the LiteLLM OpenAI proxy that we could directly inject instead? If so, are there tracked versions of the spec? From what we understand, the spec should effectively be a superset of the OpenAI OpenAPI spec, but we haven't found documentation of a version it would be tied to. Thanks!
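For context, here is a minimal FastAPI sketch of what we suspect is happening (an assumption on our part, not something confirmed in this thread): when an endpoint declares a typed Pydantic body, FastAPI can document its fields, but when it reads the raw `Request`, the body is opaque to the schema generator and Swagger UI shows no body parameters for it.

```python
from fastapi import FastAPI, Request
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    model: str

# The declared Pydantic model gives FastAPI a body schema to document,
# so `model` appears in the generated OpenAPI spec.
@app.post("/typed")
async def typed(req: ChatRequest):
    return {"model": req.model}

# Reading the raw Request leaves the body opaque to the schema
# generator, so Swagger UI documents no body parameters here.
@app.post("/untyped")
async def untyped(request: Request):
    body = await request.json()
    return {"model": body.get("model")}
```

And here is a sketch of what "directly injecting" a hand-maintained spec could look like, using FastAPI's standard hook of replacing `app.openapi` (the file name `openapi_override.json` is hypothetical, and the proxy's exact wiring may differ):

```python
import json

from fastapi import FastAPI
from litellm.proxy.proxy_server import router

app = FastAPI()
app.include_router(router)

def custom_openapi():
    # FastAPI caches the schema on app.openapi_schema, so the override
    # file is read at most once.
    if app.openapi_schema is None:
        with open("openapi_override.json") as f:  # hypothetical spec file
            app.openapi_schema = json.load(f)
    return app.openapi_schema

# /openapi.json (what Swagger UI at /docs renders) is built by calling
# app.openapi(), so replacing it serves the injected spec instead.
app.openapi = custom_openapi
```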
-
Hi @StephenKThung, thanks for looking at LiteLLM!
Not as yet, but I'm happy to work with your team on this.
@StephenKThung, could we hop on a call to understand what your team needs? Link to my cal for your convenience here: https://calendly.com/d/4mp-gd3-k5k/berriai-1-1-onboarding-litellm-hosted-version?month=2024-07
If it's easier, my LinkedIn is here: https://www.linkedin.com/in/reffajnaahsi/