LiteLLM is a great open-source project for connecting to 100+ LLMs without much trouble, and it can connect to Ollama models too, so it would be great if this project supported LiteLLM by default.
Native support for Ollama models would be much appreciated by everyone, and you know the reason for it ;-))
Btw, great project, so thank you very much for developing and maintaining it.
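For context, routing an Ollama model through LiteLLM is just a matter of passing the `ollama/` provider prefix and the local endpoint. A minimal sketch (the model name and prompt are illustrative; actually executing the call requires `pip install litellm` and a running Ollama server on the default port):

```python
# Sketch: building the arguments LiteLLM's completion() expects for an
# Ollama-hosted model. The helper name and model are illustrative.

def ollama_request(model_name: str, prompt: str) -> dict:
    """Build keyword arguments for litellm.completion() targeting Ollama."""
    return {
        "model": f"ollama/{model_name}",  # "ollama/" prefix selects the Ollama provider
        "messages": [{"role": "user", "content": prompt}],
        "api_base": "http://localhost:11434",  # Ollama's default local endpoint
    }

kwargs = ollama_request("llama3", "Hello!")
print(kwargs["model"])  # → ollama/llama3

# With an Ollama server running, the actual call would be:
# from litellm import completion
# response = completion(**kwargs)
# print(response.choices[0].message.content)
```

Because LiteLLM normalizes all providers behind the same `completion()` interface, supporting it would get Ollama (and the other 100+ backends) largely for free.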
Hey @Greatz08! We're working on this. We want to make sure we have benchmark performance metrics on standard LLMs like Claude Sonnet and GPT-4o before expanding to support more models. It's not hard to support more models, but it greatly increases eval complexity while we're still building out new features.