Describe how to reproduce
I just started using this awesome plugin. However, when I select the command `Copilot: Edit custom prompt`, it just says, "No results found".
Expected behavior
I'm not sure if this is officially supported. I'm expecting the ability to specify a user prompt template, something like this:

```
{context}
Please read the user's question supplied within the tags. Then, using only the contextual information provided above within the tags, generate an answer to the question.
{question}
```

I'm finding that in `Long Note QA` mode, the LLM draws heavily on its own knowledge in its responses. I want to constrain it to using only the information contained in the note.
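For illustration, a fuller version of the template I have in mind might look like the sketch below. The `<context>` and `<question>` tag names are hypothetical, chosen only to show the idea; I don't know what delimiters (if any) the plugin actually supports:

```
<context>
{context}
</context>

Please read the user's question supplied within the <question> tags.
Then, using only the contextual information provided within the
<context> tags above, generate an answer to the question. If the
answer is not contained in the context, say so rather than answering
from general knowledge.

<question>
{question}
</question>
```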
Screenshots