Ellama attempts to kill current buffer when there is a backend error #171
Comments
Can confirm this. Is this the intended behaviour, @s-kostyaev?
No, it's a bug. I will try to fix it soon.
@ahyatt looks like it's a bug in the llm library. It can be reproduced with:

    (llm-chat-streaming (make-llm-ollama :chat-model "llama3" :host "unreachable")
                        (llm-make-chat-prompt "test")
                        #'ignore #'ignore #'ignore)
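A variant of the same call that logs the error instead of silently ignoring it, as a minimal sketch: it assumes the llm and llm-ollama libraries from GNU ELPA are installed and loaded, the host name is deliberately bogus so the request fails, and the error callback follows the llm-chat-streaming contract of an error symbol plus a message.

    (require 'llm)
    (require 'llm-ollama)

    ;; Point the provider at a host that does not exist so the streaming
    ;; request fails, then log whatever error the library reports.
    (llm-chat-streaming (make-llm-ollama :chat-model "llama3" :host "unreachable")
                        (llm-make-chat-prompt "test")
                        #'ignore
                        #'ignore
                        (lambda (type message)
                          (message "llm error (%s): %s" type message)))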
Interesting find. @r0man, can you take a look at this? After some initial investigation,
Hi @s-kostyaev and @ahyatt, this is a known issue that @leotaku reported a while ago. He also contributed a fix: I just tried the snippet @s-kostyaev provided and I think the fix solves the issue. Since I believe the PR counts as a non-trivial contribution, we are waiting for Leo's paperwork to complete. He told me that he has done all of it and the ball is with the FSF right now, so I expect the paperwork to be complete in the next few weeks or so. Is that good enough for you? Should we do something else?

Roman
Thank you @r0man. I think we need to wait.
Hi @s-kostyaev and @ahyatt, this should be fixed now. I cut a release which should be on ELPA soonish.
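For reference, a quick way to pick up the new release and re-check the behaviour, assuming Emacs 29+ (for package-upgrade) and that llm is installed from GNU ELPA:

    ;; Refresh package metadata and upgrade llm, then re-evaluate the
    ;; reproduction snippet above and confirm the current buffer is no
    ;; longer killed or queued for saving.
    (package-refresh-contents)
    (package-upgrade 'llm)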
Steps to reproduce:
With an unreachable backend configured, Emacs will kill the current buffer, or prompt to save it first if it has unsaved changes.
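A minimal sketch of a configuration that triggers this, assuming an Ollama provider pointed at an unreachable host (ellama-provider and ellama-chat are Ellama's own variable and command; setopt needs Emacs 29, setq works as well):

    (require 'ellama)
    (require 'llm-ollama)

    ;; Point Ellama at a backend that cannot be reached so every request fails.
    (setopt ellama-provider
            (make-llm-ollama :chat-model "llama3" :host "unreachable"))

    ;; With a modified ("dirty") buffer current, run M-x ellama-chat and send
    ;; a message; the backend error then leads Emacs to kill the current
    ;; buffer, prompting to save it first if it has unsaved changes.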