
feat: add support Ollama backend #1065

Merged: 1 commit merged into k8sgpt-ai:main on Jul 3, 2024
Conversation

@yankay (Contributor) commented Apr 14, 2024

Closes #1064

📑 Description

Ollama can make it easier for users to interact with K8sGPT.
This PR adds support for an Ollama backend using the Ollama API.

Usage:

```
# ./bin/k8sgpt auth add -b ollama -m llama3 -u http://localhost:11434
ollama added to the AI backend provider list
# ./bin/k8sgpt analyze --explain -b ollama
AI Provider: ollama

0: Service haha/dao-2048()
- Error: Service has not ready endpoints, pods: [Pod/dao-2048-69696bf664-kd997], expected 1
Error: Service has not ready endpoints, pods: [Pod/dao-2048-69696bf664-kd997], expected 1.

Solution:
1. Check the pod's status using `kubectl get pod <pod_name> -o yaml`.
2. Verify if the container is running and its logs are showing any errors.
3. If the container is not running, try restarting it with `kubectl exec <pod_name> -- restart`.
4. If the issue persists, check the service's configuration to ensure it's correctly pointing to the pod's port.
```

Next TODOs:

✅ Checks

  • My pull request adheres to the code style of this project
  • My code requires changes to the documentation
  • I have updated the documentation as required
  • All the tests have passed

ℹ Additional Information

@yankay requested review from a team as code owners on April 14, 2024 06:19
@yankay changed the title from "feat: add Ollama backend" to "feat: add support Ollama backend" on Apr 14, 2024
@arbreezy (Member) commented:

@yankay this looks similar to localai's backend, which utilizes openai's API 🤔

@yankay (Contributor, Author) commented Apr 16, 2024

> @yankay this looks similar to localai's backend, which utilizes openai's API 🤔

Hi @arbreezy

Like OpenAI and Azure OpenAI, they are similar but different projects (see https://hyscaler.com/insights/ollama-vs-localai-open-source-local-llm-apis/), so Ollama needs to be implemented as a second provider.

ref:
LocalAI: https://localai.io/
Ollama: https://github.com/ollama/ollama

What do you think about that? :-)
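For context, a minimal sketch of the API difference behind the two-provider argument, assuming each project's default port (LocalAI on 8080, Ollama on 11434) and an illustrative model name: LocalAI exposes the OpenAI-style /v1/chat/completions route, while Ollama's native endpoint is /api/generate with a different payload shape.

```go
// Illustrative only: the two servers expect different routes and payloads.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// post marshals a payload and reports the HTTP status for one endpoint.
func post(url string, payload map[string]any) {
	body, _ := json.Marshal(payload)
	resp, err := http.Post(url, "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println(url, "->", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println(url, "->", resp.Status)
}

func main() {
	// LocalAI speaks the OpenAI chat-completions API.
	post("http://localhost:8080/v1/chat/completions", map[string]any{
		"model":    "llama3",
		"messages": []map[string]string{{"role": "user", "content": "hello"}},
	})

	// Ollama's native API uses a different route and a flat prompt field.
	post("http://localhost:11434/api/generate", map[string]any{
		"model":  "llama3",
		"prompt": "hello",
		"stream": false,
	})
}
```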

@arbreezy (Member) commented:

> Like OpenAI and Azure OpenAI, they are similar but different projects, so Ollama needs to be implemented as a second provider. What do you think about that? :-)

Azure OpenAI is slightly different, but I get your argument.

I don't have a strong opinion on adding another file for Ollama that is identical to localai's; ideally we would have a generic 'local' backend which supports the OpenAI APIs.

Any thoughts on that @AlexsJones @matthisholleville?
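To illustrate that idea, here is a minimal sketch of a generic OpenAI-compatible backend, assuming the sashabaranov/go-openai client; the LocalBackend type, its constructor, and the base URL are hypothetical names for illustration, not code from this PR.

```go
// Sketch of a generic "local" backend: one client for any OpenAI-compatible server.
package localai

import (
	"context"
	"errors"

	openai "github.com/sashabaranov/go-openai"
)

// LocalBackend is a hypothetical backend that only needs a base URL and a model.
type LocalBackend struct {
	client *openai.Client
	model  string
}

// NewLocalBackend points the stock OpenAI client at an arbitrary base URL,
// e.g. a LocalAI server on http://localhost:8080/v1.
func NewLocalBackend(baseURL, model string) *LocalBackend {
	cfg := openai.DefaultConfig("") // local servers typically ignore the API key
	cfg.BaseURL = baseURL
	return &LocalBackend{client: openai.NewClientWithConfig(cfg), model: model}
}

// GetCompletion sends a single-turn chat request and returns the reply text.
func (b *LocalBackend) GetCompletion(ctx context.Context, prompt string) (string, error) {
	resp, err := b.client.CreateChatCompletion(ctx, openai.ChatCompletionRequest{
		Model:    b.model,
		Messages: []openai.ChatCompletionMessage{{Role: openai.ChatMessageRoleUser, Content: prompt}},
	})
	if err != nil {
		return "", err
	}
	if len(resp.Choices) == 0 {
		return "", errors.New("no completion choices returned")
	}
	return resp.Choices[0].Message.Content, nil
}
```

The appeal of this approach is that pointing cfg.BaseURL at any OpenAI-compatible server would make a single backend cover them all.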

@yankay (Contributor, Author) commented Apr 22, 2024

> Azure OpenAI is slightly different, but I get your argument. I don't have a strong opinion on adding another file for Ollama that is identical to localai's; ideally we would have a generic 'local' backend which supports the OpenAI APIs.

Thanks @arbreezy

Ollama has an official Go client: https://github.com/ollama/ollama/blob/main/api/client.go
If the maintainers agree, I can change the code to use it :-)
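For reference, a minimal sketch of what calling that official client could look like, assuming the github.com/ollama/ollama/api package; the address, model name, and prompt are illustrative, and this is not the PR's actual code.

```go
// Sketch of calling Ollama through its official Go client.
package main

import (
	"context"
	"fmt"
	"log"
	"net/http"
	"net/url"
	"strings"

	"github.com/ollama/ollama/api"
)

func main() {
	// Same address as the -u flag in the PR description.
	base, err := url.Parse("http://localhost:11434")
	if err != nil {
		log.Fatal(err)
	}
	client := api.NewClient(base, http.DefaultClient)

	stream := false // disable streaming so the callback fires once with the full reply
	var out strings.Builder
	err = client.Generate(context.Background(), &api.GenerateRequest{
		Model:  "llama3",
		Prompt: "Why is my Service reporting no ready endpoints?",
		Stream: &stream,
	}, func(resp api.GenerateResponse) error {
		out.WriteString(resp.Response)
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out.String())
}
```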

@arbreezy (Member) commented:

> If the maintainers agree, I can change the code to use it :-)

@yankay I think this makes more sense. Any thoughts on that @AlexsJones @matthisholleville?

@AlexsJones (Member) commented:

> If the maintainers agree, I can change the code to use it :-)

> @yankay I think this makes more sense. Any thoughts on that @AlexsJones @matthisholleville?

I agree, thanks

@yankay force-pushed the ollama branch 2 times, most recently from 9a64da0 to 1d4ee29 on May 6, 2024 10:51
@yankay changed the title from "feat: add support Ollama backend" to "feat: add support Ollama backend & bump golang to 1.22" on May 6, 2024
@yankay (Contributor, Author) commented May 6, 2024

> I agree, thanks

Thanks @AlexsJones @arbreezy

It has been changed to use the official Ollama Go client: https://github.com/ollama/ollama/blob/main/api/client.go

Would you please help review it? :-)

@JuHyung-Son (Contributor) left a comment


overall good!

just lint please

pkg/ai/iai.go: outdated review thread (resolved)
@yankay changed the title from "feat: add support Ollama backend & bump golang to 1.22" to "feat: add support Ollama backend" on Jul 2, 2024
@JuHyung-Son (Contributor) left a comment


lgtm!
thanks for adding ollama

@AlexsJones merged commit b35dbd9 into k8sgpt-ai:main on Jul 3, 2024. 9 checks passed.
@yankay (Contributor, Author) commented Jul 4, 2024

Thanks @AlexsJones @JuHyung-Son for the PR review :-)

Labels: none yet
Projects: Status: Done
Linked issues: [Feature]: support Ollama backend (#1064)
4 participants