Ollama llava proxy_server_config.yaml and curl request #1864
-
Hello, I want to run ollama/llava via LiteLLM, but I think I am missing something. Can you show me how I should proceed? Thanks a lot. This curl request works against Ollama directly, but when I go through the LiteLLM proxy it gives an error for images. I also tried with image_url, base_url, etc.
This is the llava model entry in proxy_server_config.yaml.
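(The config snippet itself is not shown here; a minimal sketch, assuming the same layout as the working config posted further down in this thread, would be:)

```yaml
model_list:
  - model_name: llava
    litellm_params:
      model: ollama/llava
      api_base: "http://localhost:11434"
```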
Non-working curl request: when I send this to the proxy, it gives an error. The errors I got in different attempts are as follows:
1:
2:
-
What's the error? @fatihyildizhan
-
I couldn't find a llava proxy server config. Maybe llava is different from text models.
-
Also, this gives this error:
-
Hello @krrishdholakia, I know you're very busy, but I would appreciate it if you could look into this issue when you have the opportunity. How can I send a curl request to ollama/llava via the proxy? Thanks.
-
Here is the call I have made to the LiteLLM API; it works since #2201:

```bash
curl "http://127.0.0.1:8000/v1/chat/completions" \
  -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "model": "ollama/llava",
    "messages": [
      { "role": "user", "content": [
        { "type": "text", "text": "Whats in this image?" },
        { "type": "image_url", "image_url": { "url": "iVBORw...SuQmCC" } }
      ]}
    ]
  }'
```

Here is the response from LiteLLM:

```json
{"id":"chatcmpl-1b971af2-77f5-47a8-b06e-7465ae01251a","choices":[{"finish_reason":"stop","index":0,"message":{"content":" The image you've provided is a playful and cute illustration of an animated character that looks like an anthropomorphic sheep. It appears to be waving goodbye or saying hello, with a happy expression on its face. This type of drawing is often found in cartoons and animations where animals are given human-like characteristics for comedic effect. The artwork is stylized, with exaggerated features that are typical of cartoon representations. ","role":"assistant"}}],"created":1709154971,"model":"ollama/llava","object":"chat.completion","system_fingerprint":null,"usage":{"prompt_tokens":1,"completion_tokens":95,"total_tokens":96}}
```

Here is the LiteLLM proxy config used:

```yaml
model_list:
  - model_name: llava
    litellm_params:
      model: ollama/llava
      api_base: "http://localhost:11434"
```

LiteLLM was started locally with:

```bash
litellm --config /tmp/config.yaml --debug --detailed_debug
```
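Since the config maps model_name: llava to ollama/llava, the proxy should also accept the alias as the model name. A minimal sketch of such a call, assuming the same proxy address and payload as above:

```bash
# Call the proxy using the model_name alias from the config ("llava");
# the proxy routes it to ollama/llava per the model_list entry.
curl "http://127.0.0.1:8000/v1/chat/completions" \
  -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llava",
    "messages": [
      { "role": "user", "content": [
        { "type": "text", "text": "Whats in this image?" },
        { "type": "image_url", "image_url": { "url": "iVBORw...SuQmCC" } }
      ]}
    ]
  }'
```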