
Ellama attempts to kill current buffer when there is a backend error #171

Open
sunng87 opened this issue Nov 4, 2024 · 7 comments

@sunng87 commented Nov 4, 2024

Steps to reproduce:

  1. Configure ollama with an unreachable host
  2. Start ellama-ask-line and enter a question

Emacs will kill the current buffer, or prompt to save it first if it has unsaved changes.
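The misconfiguration in step 1 can be set up with something like the following (a sketch of the standard ellama/llm setup; the host name is a deliberately bogus placeholder, so every request fails with a connection error):

```emacs-lisp
(require 'llm-ollama)

;; Point ellama at an ollama host that does not exist.
;; "unreachable.invalid" is a placeholder; any host that refuses
;; connections reproduces the error path.
(setopt ellama-provider
        (make-llm-ollama :chat-model "llama3"
                         :host "unreachable.invalid"))
```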

@tusharhero

Can confirm this. Is this the intended behaviour, @s-kostyaev?

@s-kostyaev (Owner)

No, it's a bug. I will try to fix it soon.

s-kostyaev added the bug label on Feb 21, 2025
@s-kostyaev (Owner)

@ahyatt, this looks like a bug in the llm library. Steps to reproduce:

(llm-chat-streaming (make-llm-ollama :chat-model "llama3" :host "unreacheable")
                    (llm-make-chat-prompt "test") #'ignore #'ignore #'ignore)

@ahyatt commented Feb 22, 2025

Interesting find. @r0man, can you take a look at this? After some initial investigation, it looks like plz--kill-buffer is killing the wrong buffer. I think you will have a better idea of why this is happening than I do. It probably does not have to do with streaming; I can also reproduce it with llm-chat-async.
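For context, this failure mode generally looks like the following (an illustrative sketch, not plz's actual code; both function names are hypothetical):

```emacs-lisp
(require 'subr-x)  ; for when-let on older Emacs

;; BUG pattern: a cleanup handler that kills (current-buffer).
;; When it runs from a process sentinel or callback, the current
;; buffer may be the user's buffer rather than the HTTP connection
;; buffer, so the wrong buffer gets killed.
(defun my--cleanup-buggy ()
  (kill-buffer (current-buffer)))

;; Safer pattern: kill the buffer associated with PROC explicitly,
;; never whatever buffer happens to be current.
(defun my--cleanup (proc)
  (when-let ((buf (process-buffer proc)))
    (kill-buffer buf)))
```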

@r0man (Contributor) commented Feb 23, 2025

Hi @s-kostyaev and @ahyatt,

This is a known issue that @leotaku reported a while ago. He also contributed a fix:
r0man/plz-media-type#14

I just tried the snippet @s-kostyaev provided, and I think the fix solves the issue. Since I believe the PR counts as a non-trivial contribution, we are waiting for Leo's copyright paperwork to complete. He told me that he has done all of it and the ball is in the FSF's court right now, so I expect the paperwork to be complete in the next few weeks.

Is that good enough for you? Should we do something else?

Roman

@s-kostyaev (Owner)

Thank you, @r0man.

I think we need to wait.

@r0man (Contributor) commented Feb 28, 2025

Hi @s-kostyaev and @ahyatt,

This should be fixed now. I cut a release, which should be on ELPA soonish:
https://github.com/r0man/plz-media-type/releases/tag/v0.2.3
