Proxy Mode Issues with Together.ai as the Backend Service #2027
Unanswered
vitalii-lobanov asked this question in Q&A
Replies: 2 comments
-
Hi @vitalii-lobanov, your settings look good to me. Can we hop on a quick call and debug this together? Sharing a link to my calendar for your convenience:
-
I've identified and corrected an error in my configuration. Initially, I had set the API base URL for Together.ai to https://api.together.xyz/inference, which was incorrect. After updating it to https://api.together.xyz/v1, everything is now functioning properly. Here is my updated configuration:
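In outline (the model and environment-variable names below are illustrative placeholders rather than my exact values), the corrected model_list entry looks like this:

```yaml
model_list:
  - model_name: mixtral-8x7b
    litellm_params:
      model: together_ai/mistralai/Mixtral-8x7B-Instruct-v0.1
      api_base: https://api.together.xyz/v1        # corrected base URL
      api_key: os.environ/TOGETHERAI_API_KEY
```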
-
I'm encountering an error when attempting to use Litellm in proxy mode, with Together.ai serving as the backend service. Here is the curl query:
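In outline (the host, port, and model alias below are illustrative placeholders), the request looks like this:

```bash
curl http://0.0.0.0:8000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "mixtral-8x7b",
        "messages": [{"role": "user", "content": "Hello, who are you?"}]
      }'
```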
The response is the following:
Litellm's logs:
Below is the complete Litellm test configuration that I used for this experiment:
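In outline, the Together.ai entry pointed at the legacy /inference endpoint, which (as noted in the reply above) turned out to be the root cause; the model name below is an illustrative placeholder:

```yaml
model_list:
  - model_name: mixtral-8x7b
    litellm_params:
      model: together_ai/mistralai/Mixtral-8x7B-Instruct-v0.1
      api_base: https://api.together.xyz/inference   # incorrect base URL, triggers the error
      api_key: os.environ/TOGETHERAI_API_KEY
```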
Litellm version:
I am able to access Together.ai's API from the same machine where Litellm is running. Here is an example of the query:
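In outline (the model name is an illustrative placeholder), a direct call to Together.ai's OpenAI-compatible endpoint looks like this:

```bash
curl https://api.together.xyz/v1/chat/completions \
  -H "Authorization: Bearer $TOGETHERAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
        "messages": [{"role": "user", "content": "Hello, who are you?"}]
      }'
```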
The response is:
Additionally, I've verified Litellm's functionality by testing it with the Openrouter API, which worked correctly. Here is the query used for testing:
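Routing to Openrouter goes through the same /chat/completions call as above; the only difference is the model_list entry behind it, sketched here with placeholder names:

```yaml
model_list:
  - model_name: openrouter-mixtral
    litellm_params:
      model: openrouter/mistralai/mixtral-8x7b-instruct
      api_key: os.environ/OPENROUTER_API_KEY
```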
The response is correct:
Finally, here are the complete Litellm logs corresponding to the error encountered when Litellm is used as a proxy for Together.ai's API:
What might I have configured incorrectly, and how can I correctly set up Litellm to serve as a proxy for Together.ai's service?