

[please test] BYOK with ollama #342

Open
olegklimov opened this issue Oct 2, 2024 · 3 comments


@olegklimov (Contributor)

With the ollama project, it's easy to host our own AI models.

You can set up bring-your-own-key (BYOK) to connect to an ollama server, and see whether you can use StarCoder2 for code completion and llama models for chat.

Does it work at all? What do we need to fix to make it better?
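For anyone testing this: ollama exposes an OpenAI-compatible API on `http://localhost:11434/v1` (after `ollama serve` and e.g. `ollama pull llama3`), so a BYOK entry would point the chat endpoint there. As a minimal sketch, this builds (but does not send) the request a BYOK client would issue; the model name `llama3` is just an example, and the bearer token can be any string since ollama ignores it:

```python
import json
import urllib.request

# ollama serves an OpenAI-compatible API on localhost:11434 by default.
OLLAMA_BASE = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat request for ollama."""
    body = json.dumps({
        "model": model,  # e.g. "llama3", pulled beforehand via `ollama pull llama3`
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }).encode()
    return urllib.request.Request(
        f"{OLLAMA_BASE}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer ollama",  # placeholder key; ollama does not check it
        },
        method="POST",
    )

req = build_chat_request("llama3", "Say hello")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

Sending the request with `urllib.request.urlopen(req)` against a running ollama instance should return a standard chat-completion JSON body, which is what makes the BYOK wiring straightforward.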

@pardeep-singh

@olegklimov I would like to take this up. Could you please share some docs or an example of how this can be done? Do we only need to test the integration here, or also make changes to get it working?

@olegklimov (Contributor, Author)

Oh, here: https://docs.refact.ai/byok/ and you can also test whether our documentation is any good :D

@avie66 (Contributor) commented Oct 21, 2024

Hi @pardeep-singh
Did you have a look at Oleg's approach?

No branches or pull requests · 4 participants