
[suggestion] Ask for additional buffer context via user input #19

Open
MrGrinst opened this issue Jun 25, 2024 · 14 comments
Labels
enhancement New feature or request

Comments

@MrGrinst
Contributor

There are cases where it would be nice for the LLM to have the context of another file I have open. I don't have a good idea yet on the implementation specifics, but I'm wondering if you would be open to a PR that prompts the user for additional buffers to include as context for the LLM.

@frankroeder
Owner

I am open to having something experimental on a different branch; I'm guessing it will take a bit more time to test this out and find the best solution.

frankroeder added the enhancement (New feature or request) label on Jun 25, 2024
@eterps
Contributor

eterps commented Jun 25, 2024

> you would be open to a PR that prompts the user for additional buffers to include as context for the LLM

You might want to check out aider for inspiration, which does exactly that.
Unfortunately it's a separate TUI app, not a Neovim plugin (which is the reason I'm not using it more often). Still, a lot can be learned from it.

@MrGrinst
Contributor Author

@frankroeder @eterps what do you think about having a template variable like `{{userinput_buffers}}`? When that variable is detected in the template, it would use fzf-lua to ask the user to select from open buffers.
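
A rough sketch of what I'm imagining, using fzf-lua's `fzf_exec` (names like `select_context_buffers` and `on_done` are purely illustrative, and the exact fzf-lua options may need adjusting):

```lua
-- Illustrative sketch only: let the user multi-select open buffers via fzf-lua
-- and hand their contents to a callback that builds the {{userinput_buffers}} value.
local function select_context_buffers(on_done)
  -- Collect listed, loaded buffers as "bufnr: name" entries.
  local entries = {}
  for _, buf in ipairs(vim.api.nvim_list_bufs()) do
    if vim.api.nvim_buf_is_loaded(buf) and vim.bo[buf].buflisted then
      table.insert(entries, string.format("%d: %s", buf, vim.api.nvim_buf_get_name(buf)))
    end
  end

  require("fzf-lua").fzf_exec(entries, {
    prompt = "Context buffers> ",
    fzf_opts = { ["--multi"] = true },
    actions = {
      ["default"] = function(selected)
        local chunks = {}
        for _, entry in ipairs(selected) do
          local bufnr = tonumber(entry:match("^(%d+):"))
          local lines = vim.api.nvim_buf_get_lines(bufnr, 0, -1, false)
          table.insert(chunks, table.concat(lines, "\n"))
        end
        -- The selection arrives asynchronously, so the prompt has to be
        -- assembled inside this callback rather than returned from the function.
        on_done(table.concat(chunks, "\n\n"))
      end,
    },
  })
end
```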

@frankroeder
Owner

@MrGrinst this is what I already have on my local branch. I just don't have the time to finish it at the moment.

@frankroeder
Owner

@MrGrinst, how is it going with this feature? I guess the async handling in fzf-lua is not so easy to get right.

@MrGrinst
Contributor Author

Yeah I haven't had time to finish it for that exact reason. The async stuff is beyond my Lua skills.
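
For anyone picking this up later: one possible way around the callback flow (an untested sketch, not something I've implemented) is to run the command inside a coroutine and resume it from the fzf-lua action, so the calling code reads sequentially:

```lua
-- Sketch: make the async fzf-lua selection look synchronous via a coroutine.
local function pick_buffers_blocking()
  local co = assert(coroutine.running(), "must be called inside a coroutine")
  require("fzf-lua").fzf_exec(vim.fn.getcompletion("", "buffer"), {
    actions = {
      ["default"] = function(selected)
        coroutine.resume(co, selected)
      end,
    },
  })
  return coroutine.yield() -- suspends until the action above resumes us
end

-- Usage: wrap the command body in a coroutine.
coroutine.wrap(function()
  local selected = pick_buffers_blocking()
  vim.print(selected)
end)()
```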

@eterps
Contributor

eterps commented Jul 28, 2024

Last week I played around with this nvim integration for aider.

My first impression is that aider doesn't integrate very well with nvim, but I liked this plugin's feature of automatically adding all open buffers as context.

It really makes sense as long as the buffers you have open correspond to the task at hand (otherwise you would need to delete some buffers first).

@frankroeder
Owner

I have added the feature to include the content of all open buffers. Feel free to check out the branch for pull request #37.
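
Roughly, the idea is to concatenate every listed, loaded buffer (a simplified sketch; the actual code in the PR differs):

```lua
-- Simplified illustration of "include all open buffers as context".
local function collect_open_buffers()
  local chunks = {}
  for _, buf in ipairs(vim.api.nvim_list_bufs()) do
    if vim.api.nvim_buf_is_loaded(buf) and vim.bo[buf].buflisted then
      local name = vim.api.nvim_buf_get_name(buf)
      local lines = vim.api.nvim_buf_get_lines(buf, 0, -1, false)
      table.insert(chunks, name .. ":\n" .. table.concat(lines, "\n"))
    end
  end
  return table.concat(chunks, "\n\n")
end
```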

Just to be clear, tools like Aider and Devin aim to "replace" the programmer's work. Our focus, by contrast, is on supporting the programmer through LLMs.

@eterps
Contributor

eterps commented Jul 31, 2024

> I have added the feature to include the content of all open buffers.

Oooh nice, that looks incredibly useful. Definitely gonna check it out 👍

> We focus on supporting the programmer through LLMs

Yeah, I agree, no need to turn this into Aider/Devin. The way parrot.nvim's templates work offers more fine-grained control anyway.

@eterps
Contributor

eterps commented Aug 1, 2024

@frankroeder, it works great. Very useful feature.

I noticed that I had to change the last part of the prompt to:

```
Please finish the code above carefully and logically.
Respond with the snippet of code that should be inserted, but absolutely no explanations or code block markers.
```

Otherwise Claude 3.5 would keep adding explanations or code blocks.

I'm wondering if it would be nice to have this for chats as well (instead of just for commands).

@frankroeder
Owner

Great, I'm glad to hear that!
Sure, I guess each model needs system prompt fine-tuning at some point, and it is not easy to provide general prompts that work equally well.

I believe this is what we already have with #27 for chats and the general templates in the config that can be overridden for the commands. If I've misunderstood you, could you provide more details on what exactly you want to change and under which circumstances?

@MrGrinst
Contributor Author

This is great, I like this approach! Thanks @frankroeder. I'm thinking of adding a check to skip files larger than a certain size, to avoid expensive calls (I already had one go through that was 100k tokens 😮). Would you be open to that?
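
Roughly what I have in mind (the threshold and the ~4 bytes-per-token estimate are made-up placeholders):

```lua
-- Hypothetical size guard: skip buffers above a byte threshold before they
-- are added to the prompt context.
local MAX_BYTES = 100 * 1024 -- ~25k tokens at roughly 4 bytes per token

local function include_buffer(buf)
  -- Byte size of the whole buffer.
  local bytes = vim.api.nvim_buf_get_offset(buf, vim.api.nvim_buf_line_count(buf))
  if bytes > MAX_BYTES then
    vim.notify(
      ("Skipping %s (%.0f KB exceeds context limit)"):format(vim.api.nvim_buf_get_name(buf), bytes / 1024),
      vim.log.levels.WARN
    )
    return false
  end
  return true
end
```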

@MrGrinst
Contributor Author

Well, it ended up being pretty hacky and probably makes some assumptions based on my usage that might not hold for others. I'm not going to open a PR, but for reference, here's the approach: 2b644d8

@frankroeder
Owner

@MrGrinst, that is a very good suggestion. I am considering adding an optional confirmation for a specific approximate token limit. Thank you very much for your idea and implementation.
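
Something along these lines, perhaps (the limit value and the 4-characters-per-token estimate are placeholders, not decided defaults):

```lua
-- Sketch of an optional confirmation once the assembled context exceeds an
-- approximate token limit.
local function confirm_token_budget(context, limit)
  local approx_tokens = math.floor(#context / 4)
  if approx_tokens <= limit then
    return true
  end
  local choice = vim.fn.confirm(
    ("Prompt is roughly %d tokens (limit %d). Send anyway?"):format(approx_tokens, limit),
    "&Yes\n&No",
    2 -- default to "No"
  )
  return choice == 1
end
```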
