Replies: 6 comments 2 replies
-
It probably makes sense to switch to something like https://github.com/themaximalist/llm.js or https://github.com/samestrin/llm-interface, since everyone thinks their LLM is better than the others anyway, and we won't add support for every single one of them.
-
Also, supporting an AI gateway might be an additional solution: it streamlines requests to 250+ language, vision, audio, and image models behind a unified API.
-
Another reason to support Gemini is that it has a pretty generous free tier (free as in you pay with data instead of money).
-
Did anyone try this: https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/call-vertex-using-openai-library? It seems Gemini has an OpenAI-compatible API. You just need to set the correct base URL and it'll work.
-
I have a problem setting this up. Should it be:
-
If you self-host a LiteLLM proxy, you should be able to use whatever LLM you want.
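For reference, a minimal sketch of a LiteLLM proxy config that routes an OpenAI-style model name to Gemini. The model name, alias, and key variable are illustrative assumptions; see the LiteLLM docs for the exact provider prefixes.

```yaml
# config.yaml -- hypothetical sketch of a LiteLLM proxy setup.
# "gemini-flash" is an alias of our choosing; the provider-prefixed
# model string and env-var name should be checked against LiteLLM's docs.
model_list:
  - model_name: gemini-flash
    litellm_params:
      model: gemini/gemini-1.5-flash
      api_key: os.environ/GEMINI_API_KEY
```

You would then start the proxy with `litellm --config config.yaml` and point the application at the proxy's local URL (by default `http://localhost:4000`) using any OpenAI-compatible client, which is what makes "whatever LLM you want" possible.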
-
Is there any chance there might be support for Gemini in the future? I have come to really like it, and I already have my stuff in one dashboard, so I'm just wondering.