Overhaul how chat memory is configured #112
Conversation
Force-pushed from 3f2f5ea to 52a0bc1.
 * @param aiService The bean that implements the AI Service annotated with {@link RegisterAiService}
 * @param memoryId The object used as the memory ID for which the corresponding {@link ChatMemory} should be removed
 */
public static void remove(Object aiService, Object memoryId) {
I don't like using two objects here as users could easily swap them, but I don't have a better way of handling it...
Context (no pun intended) for #47 :)
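For illustration, here is a rough sketch of how this method might be called from application code. The Assistant interface, the ConversationManager class, and the import paths are assumptions made for this example; only the remove(Object, Object) signature comes from the PR.

import dev.langchain4j.service.MemoryId;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.ChatMemoryRemover;   // import paths assumed
import io.quarkiverse.langchain4j.RegisterAiService;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;

@Singleton
public class ConversationManager {

    // Hypothetical AI service, not taken from the PR.
    @RegisterAiService
    public interface Assistant {
        String chat(@MemoryId String conversationId, @UserMessage String message);
    }

    @Inject
    Assistant assistant;

    public void endConversation(String conversationId) {
        // First argument: the AI service bean; second: the memory id.
        // Both are plain Objects, which is the "easy to swap" concern mentioned above.
        ChatMemoryRemover.remove(assistant, conversationId);
    }
}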
* It uses whatever `dev.langchain4j.store.memory.chat.ChatMemoryStore` bean is configured as the backing store. The default implementation is `dev.langchain4j.store.memory.chat.InMemoryChatMemoryStore`.
** If the application provides its own `ChatMemoryStore` bean, that will be used instead of the default `InMemoryChatMemoryStore`.
* It leverages the available configuration options under `quarkus.langchain4j.chat-memory` to construct the `ChatMemoryProvider`.
** The default configuration values result in the use of `dev.langchain4j.memory.chat.MessageWindowChatMemory` with a window size of ten.
Is there a way to make the window unbounded?
If a user wanted to keep the window but instead passivate/persist memory that hasn't been accessed for a while, is that where ChatMemoryRemover comes into play?
"Is there a way to make the window unbounded?"

Nope, Langchain4j doesn't have that - you would need to write your own.

"If a user wanted to keep the window but instead passivate/persist memory that hasn't been accessed for a while, is that where ChatMemoryRemover comes into play?"

Nope, see the updated samples that use ChatMemoryRemover to see when it's meant to be used.
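As an aside (an editor's sketch, not something from the thread): short of writing a custom ChatMemory, one pragmatic approximation of an unbounded window is to configure MessageWindowChatMemory with a window so large it never evicts. The class and method names below are made up for the example.

import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.store.memory.chat.ChatMemoryStore;

public class EffectivelyUnboundedMemory {

    // Not truly unbounded, but a window this large never evicts in practice.
    public static MessageWindowChatMemory create(Object memoryId, ChatMemoryStore store) {
        return MessageWindowChatMemory.builder()
                .maxMessages(Integer.MAX_VALUE)
                .id(memoryId)
                .chatMemoryStore(store)
                .build();
    }
}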
Okay, so maybe I'm approaching this wrong... but do I grok this right: it's purely just a way to demarcate when it's okay for the memory to disappear? Like exactly what a scope begin/end does in CDI?
Btw, no objections to it, as it makes sense to have something like this available - just trying to imagine how this would look if we actually had some clear CDI scopes available for WebSockets et al.
When we have clear scopes (or when the existing request scope is used), everything works magically 😄.
I mean that the extension already has the logic to remove chat messages from the store.
I'd like @andreas-eberle to also have a look since he raised the concern about this.
Once this is in, I'll do a …
return MessageWindowChatMemory.builder()
        .maxMessages(20)
        .id(memoryId)
        .chatMemoryStore(new InMemoryChatMemoryStore())
Does it make sense to create a new store here every time?
Good catch! I fixed it
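The fix presumably moves the store out of the provider so that a single instance is reused, along these lines (a sketch of the idea, not the actual committed change; the class name is invented):

import dev.langchain4j.memory.chat.ChatMemoryProvider;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.store.memory.chat.ChatMemoryStore;
import dev.langchain4j.store.memory.chat.InMemoryChatMemoryStore;

public class SampleChatMemoryConfig {

    // One store shared by every ChatMemory, instead of a fresh
    // InMemoryChatMemoryStore on each invocation of the provider.
    private final ChatMemoryStore store = new InMemoryChatMemoryStore();

    public ChatMemoryProvider chatMemoryProvider() {
        return memoryId -> MessageWindowChatMemory.builder()
                .maxMessages(20)
                .id(memoryId)
                .chatMemoryStore(store)
                .build();
    }
}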
Looks good to me. Thanks @geoand!
The basic ideas are now:

* A default ChatMemoryStore bean using in-memory storage is now provided
** This can be overridden by users simply by providing their own bean
* A default ChatMemoryProvider bean is now provided
** It uses the ChatMemoryStore bean under the hood
** It also leverages configuration to use either MessageWindowChatMemory or TokenWindowChatMemory

ChatMemory objects continue to be removed when the AI service class goes out of scope, however there is now also a programmatic way of removing ChatMemory by using ChatMemoryRemover. This is necessary for when the AI service is not request scoped.
Force-pushed from 52a0bc1 to 1e27a97.
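The override mentioned in the summary above - supplying your own ChatMemoryStore bean instead of the default InMemoryChatMemoryStore - could look roughly like this. This is an illustrative sketch: the class name and the backing map are assumptions, the jakarta.* imports assume Quarkus 3, and a real application would typically back the store with a database or cache rather than a map.

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.store.memory.chat.ChatMemoryStore;
import jakarta.enterprise.context.ApplicationScoped;

import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Providing a bean like this replaces the default InMemoryChatMemoryStore.
@ApplicationScoped
public class MyChatMemoryStore implements ChatMemoryStore {

    private final Map<Object, List<ChatMessage>> messagesById = new ConcurrentHashMap<>();

    @Override
    public List<ChatMessage> getMessages(Object memoryId) {
        return messagesById.getOrDefault(memoryId, List.of());
    }

    @Override
    public void updateMessages(Object memoryId, List<ChatMessage> messages) {
        messagesById.put(memoryId, messages);
    }

    @Override
    public void deleteMessages(Object memoryId) {
        messagesById.remove(memoryId);
    }
}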