Allow the AiService to inject AiMessage into the prompt #905
For example, if I'm using
Hm, again I wonder if this is something that would be handled by
No, at least I don't think so, from what I know about the
Fair enough. Question: wouldn't having
I see your point... in this case the
What I'm trying to say is: in the example above, the result of the memory after a call should be something like:
and not:
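The concrete snippets were lost from the thread, so here is one plausible reading of the two memory layouts being contrasted, as a plain-Java sketch. This is not the langchain4j API; the message rendering, method names, and exact expected contents are all assumptions for illustration only.

```java
import java.util.List;

// Plain-Java illustration (NOT langchain4j API) of the two memory layouts
// discussed above. The rendering of messages as strings is an assumption.
public class MemoryLayoutSketch {

    // Presumed desired layout: the "Output:" prefix ends up in the AiMessage,
    // so memory holds a clean user message and a prefixed assistant message.
    static List<String> desiredMemory(String question, String completion) {
        return List.of(
                "UserMessage: Input: " + question,
                "AiMessage: Output: " + completion);
    }

    // Current layout: the prefix stays glued to the UserMessage, and the
    // AiMessage contains only the raw model completion.
    static List<String> currentMemory(String question, String completion) {
        return List.of(
                "UserMessage: Input: " + question + "\nOutput:",
                "AiMessage: " + completion);
    }
}
```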
I see. @langchain4j, is this something that would make sense for the project?
I'm thinking more about the scenarios where I could use the
I understand that for providers using a "chat" API this can be a bit weird. In any case, I know how to go on without adding this feature; from my side it is also possible to close this issue. :)
Thanks for the update. Let's see what @langchain4j thinks.
Hi @andreadimaio, could you provide a (real-world) example of when this can be useful? There is a related request for Anthropic, but there we agreed to handle it on
We can use the first message as an example. Let's say I'm using
So suppose the request to get the best answer has to be formatted like this:
Where:
Today, it is not possible to use the annotations in the

```java
@RegisterAiService
public interface AiService {

    @SystemMessage("You are a helpful assistant")
    @UserMessage("Input: {question}\nOutput:")
    String answer(String question);
}
```

the result will be:
As you can see, in this case the "Output:" prefix is in the
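To make the difference concrete, here is a minimal, dependency-free sketch of how the prompt pieces from the service above are assembled today versus how they could be assembled if the AiMessage could be seeded. This is plain Java, not langchain4j code; the method names and the string-pair representation are hypothetical.

```java
// Hypothetical sketch: where the "Output:" text could live in the prompt.
// Each method returns { userMessage, aiMessagePrefix }. Not langchain4j API.
public class PromptSplitSketch {

    // Today: the whole rendered template, including "Output:", becomes the
    // user message, and there is no way to seed the AiMessage.
    static String[] currentBehavior(String question) {
        String userMessage = "Input: " + question + "\nOutput:";
        String aiMessagePrefix = "";
        return new String[] { userMessage, aiMessagePrefix };
    }

    // Proposed: "Output:" is injected as the start of the AiMessage instead,
    // leaving the user message clean.
    static String[] proposedBehavior(String question) {
        String userMessage = "Input: " + question;
        String aiMessagePrefix = "Output:";
        return new String[] { userMessage, aiMessagePrefix };
    }
}
```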
From my point of view, the
@andreadimaio sorry for the confusion, but I am still not sure what the purpose of "Output:" is here?
This is just an example, but yes, the ability to add a prefix gives the developer more flexibility when writing a prompt, and, as you said, some models could be fine-tuned to respond better with a prefix.
@andreadimaio sure, let's go with
No, personally I don't have any real-world scenario. As I wrote in a message above, my use cases can do without the
It might be useful to have an annotation that allows the developer to use a template for the AiMessage. Example:
This is particularly useful if you are using models that use tags.
I don't know if this is something that might come from langchain4j.
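As a sketch of what such an annotation might look like: the `@AiMessage` annotation below does not exist in langchain4j or quarkus-langchain4j. It is declared locally here only to make the example self-contained, and the `<answer>` tag template is an arbitrary illustration of the tag-based use case mentioned above.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical sketch only: @AiMessage is NOT part of langchain4j or
// quarkus-langchain4j. It is declared here so the example compiles.
public class AiMessageAnnotationSketch {

    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface AiMessage {
        String value(); // template used to seed the assistant message
    }

    // How an AI service method might use it, e.g. to open a tag that the
    // model is then expected to complete:
    interface AiService {
        @AiMessage("<answer>")
        String answer(String question);
    }

    // Reads the seeded template back via reflection, as a framework would.
    static String seededTemplate() {
        try {
            return AiService.class.getMethod("answer", String.class)
                    .getAnnotation(AiMessage.class).value();
        } catch (NoSuchMethodException e) {
            throw new RuntimeException(e);
        }
    }
}
```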