It seems that after a while I start to get the same response over and over again, even if I use a different prompt. It happened with 3 different chats while I was testing with the same model.
Settings:
Linux (ArchLinux), zsh
elia installed via pipx
Using ollama
Model phi3
The modelfile sets "num_keep 4", which, as far as I understand, keeps the first 4 tokens of the prompt when the context window fills up.
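For reference, this is roughly how such a parameter appears in an ollama Modelfile (an illustrative fragment, not phi3's actual modelfile):

```
FROM phi3
# Illustrative: number of prompt tokens to retain when the context is truncated
PARAMETER num_keep 4
```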
I will try with other models later. Please let me know if you need more info.
Edit: I have just run some tests with another model whose modelfile does not set "num_keep" (stablelm2:zephyr), and it shows the same problem: after a while it keeps repeating information from previous inferences.
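If the repetition is tied to context-window overflow rather than to "num_keep" itself, the mechanism would look something like the sketch below: llama.cpp-style truncation keeps the first `n_keep` tokens and the most recent tokens that still fit (this is my understanding of the general scheme, not code taken from elia or ollama; all names and numbers are illustrative):

```python
def truncate_context(tokens, n_ctx, n_keep):
    """Rough sketch of context truncation: once the conversation exceeds
    the context size n_ctx, keep the first n_keep tokens plus the most
    recent tokens that still fit; everything in the middle is dropped."""
    if len(tokens) <= n_ctx:
        return tokens  # still fits, nothing to drop
    tail = n_ctx - n_keep  # room left for the most recent tokens
    return tokens[:n_keep] + tokens[-tail:]

# e.g. 10 tokens, context of 6, keep first 2 -> first 2 + last 4
print(truncate_context(list(range(10)), 6, 2))  # [0, 1, 6, 7, 8, 9]
```

Under this scheme, old answers near the head of the kept window could keep getting re-attended once the middle of the conversation is dropped, which would be consistent with the repetition described above.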
Thanks again. Regards