Support for ollama and litellm needed #338

Closed
Greatz08 opened this issue Dec 26, 2024 · 1 comment
Comments

@Greatz08

LiteLLM is a great open-source project for connecting to 100+ LLMs easily, without much trouble, and it can help with connecting to Ollama models too, so it would be great if this project supported LiteLLM by default.

Native support for Ollama models would be much appreciated by everyone; you know the reason for it ;-))
Btw, great project, so thank you very much for developing and maintaining it.

@kamath
Contributor

kamath commented Dec 28, 2024

Hey @Greatz08! We're working on this. We want to make sure we have benchmark performance metrics on standard LLMs like Claude Sonnet + 4o before expanding to support more models. It's not hard to support more models, but it greatly increases eval complexity while we're still building out new features.

@kamath kamath closed this as completed Dec 28, 2024