
[BUG] Ollama Keep Alive "-1" not working #3576

Open
sebaxakerhtc opened this issue Nov 26, 2024 · 1 comment
Describe the bug
Ollama's keep alive setting accepts the value "-1", which keeps the model loaded in memory indefinitely. Flowise's ChatOllama node does not honor this value and falls back to the default of 4 minutes.
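For reference, Ollama's REST API takes `keep_alive` directly in the request body, so a client node only needs to forward the value unchanged. A minimal sketch in Python; the helper name `build_generate_payload` is hypothetical and not part of Flowise or Ollama:

```python
import json

def build_generate_payload(model, prompt, keep_alive=None):
    """Build an Ollama /api/generate request body.

    keep_alive semantics (per the Ollama API docs):
      - a duration string such as "5m" or "24h"
      - 0  -> unload the model right after the response
      - -1 -> keep the model loaded indefinitely
    """
    payload = {"model": model, "prompt": prompt}
    if keep_alive is not None:
        # Forward the value verbatim; coercing or dropping -1 here
        # is exactly the kind of bug this issue describes.
        payload["keep_alive"] = keep_alive
    return payload

body = build_generate_payload("llama3", "hi", keep_alive=-1)
print(json.dumps(body))
```

If the node builds its payload this way, `-1` reaches Ollama untouched and `ollama ps` reports the model as loaded forever.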

To Reproduce
Steps to reproduce the behavior:

  1. Add 'ChatOllama' to a workflow
  2. Open ChatOllama's additional settings
  3. Set 'Keep Alive' to "-1"
  4. After the model loads, run 'ollama ps' and see "4 minutes from now"
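The steps above hinge on how Ollama interprets the `keep_alive` value. A small sketch of that interpretation (my own approximation for illustration, not Ollama's actual parser):

```python
def parse_keep_alive(value):
    """Approximate how Ollama reads keep_alive.

    Returns the number of seconds to keep the model loaded,
    float('inf') for "keep forever", or 0 for "unload now".
    """
    units = {"s": 1, "m": 60, "h": 3600}
    if isinstance(value, (int, float)):
        # Any negative number means "never unload".
        return float("inf") if value < 0 else float(value)
    value = value.strip()
    if value.endswith(tuple(units)):
        # Duration string such as "5m" or "24h".
        return float(value[:-1]) * units[value[-1]]
    return float(value)  # bare number passed as a string

print(parse_keep_alive(-1))    # inf: `ollama ps` shows the model kept forever
print(parse_keep_alive("5m"))  # 300.0: the default keep-alive window
```

Under this reading, a node that silently replaces "-1" with its own default duration would produce exactly the "4 minutes from now" output seen in step 4.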

Expected behavior
Expected to see "forever from now" in the 'ollama ps' output after the model loads.

Screenshots
(screenshot of the 'ollama ps' output attached in the original issue)

Flow
Not applicable.

Setup

  • Installation: Docker
  • Flowise Version: 2.1.5
  • OS: Windows
  • Browser: any


sebaxakerhtc (Author) commented Nov 27, 2024

A half-related discussion shows why it is important to be able to keep a single model loaded in VRAM indefinitely. Just my use case.
