
Ollama/ Local LLM Support #437

Open
tomtom215 opened this issue Dec 2, 2024 · 1 comment
Hello,

This is an awesome project, thank you for sharing!

I was thinking that a lot more people (and potential contributors) would try this project out if they could use a local LLM API server like Ollama. It would also help with knowledge-graph use cases that have sensitive-data requirements, as well as with quick local experimentation.

For example, here is how their embedding API endpoint looks: https://ollama.com/blog/embedding-models
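For reference, a minimal sketch of calling that embedding endpoint from Python, assuming a locally running Ollama server on its default port (11434) and an embedding model already pulled (the model name below is illustrative):

```python
import json
import urllib.request

# Default address of a local `ollama serve` instance (an assumption).
OLLAMA_URL = "http://localhost:11434/api/embeddings"

def build_embedding_request(model: str, prompt: str) -> bytes:
    """Serialize the JSON body Ollama's embedding endpoint expects."""
    return json.dumps({"model": model, "prompt": prompt}).encode("utf-8")

def embed(model: str, prompt: str) -> list:
    """POST to a local Ollama server and return the embedding vector."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_embedding_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]

# Example (requires a running server and a pulled model):
# vec = embed("mxbai-embed-large", "Llamas are members of the camelid family")
```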

@634750802 (Collaborator)

Hi @tomtom215, thanks for your feedback!

We already support Ollama on the main branch; please wait for our next Docker release (#419).


@sykp241095 sykp241095 added this to the Pre-Framework-ify milestone Dec 3, 2024