Currently, the code that was supposed to maintain context/state between runs is all commented out.

We need to bring back this feature by figuring out how to serialize the context. That means we need to serialize:

- the globals dict
- the queues
- etc.

In this case, I don't think it's enough to maintain just the globals dict. We need all of it, especially to support future use cases where runs of a workflow can be stepwise, take days, support undo/rewind, and so on.

This will require figuring out how to do this in llama-index itself first, of course.
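To make the shape of the problem concrete, here is a minimal sketch of what snapshotting the two pieces listed above (the globals dict and the queues) could look like. The function names are hypothetical, not anything that exists in llama-index today, and it assumes plain `asyncio.Queue`s holding JSON-serializable items; events carrying arbitrary Python objects would need a pickling or custom-serializer strategy on top of this.

```python
import json
from asyncio import Queue
from typing import Any, Dict, List, Tuple


def snapshot_context(globals_dict: Dict[str, Any], queues: Dict[str, Queue]) -> str:
    """Serialize the parts of a workflow context we care about to JSON.

    Queues are drained into plain lists so they can round-trip; everything in
    the globals dict is assumed to be JSON-serializable for this sketch.
    """
    queue_items: Dict[str, List[Any]] = {}
    for name, q in queues.items():
        items: List[Any] = []
        while not q.empty():
            items.append(q.get_nowait())
        queue_items[name] = items
        # Put the items back so the live context is left unchanged.
        for item in items:
            q.put_nowait(item)
    return json.dumps({"globals": globals_dict, "queues": queue_items})


def restore_context(payload: str) -> Tuple[Dict[str, Any], Dict[str, Queue]]:
    """Rebuild the globals dict and the queues from a JSON snapshot."""
    data = json.loads(payload)
    queues: Dict[str, Queue] = {}
    for name, items in data["queues"].items():
        q: Queue = Queue()
        for item in items:
            q.put_nowait(item)
        queues[name] = q
    return data["globals"], queues
```

The long-running / undo-rewind use cases would then amount to keeping a history of these snapshots, but that only works once everything the context holds is actually serializable, which is the llama-index-side work mentioned above.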
I'm looking first at a simple case of managing a session-level context in the WorkflowService. I see a framework is in place but not yet implemented. The basic idea is easy: create a context for each session and persist it (in memory initially, eventually on storage).

The simple approach is to wire that into the existing WorkflowService or a subclass, but I'm wondering whether the notion of a separate ContextManager class is appropriate. I'll probably hold that in reserve as a later refactoring.
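For discussion, the separate class could be as small as the sketch below. The class and method names are hypothetical, not an existing API; the WorkflowService (or a subclass) would fetch the context by session id before launching a run and write it back afterwards, and a storage-backed implementation could later swap in behind the same interface.

```python
from typing import Any, Dict


class InMemoryContextManager:
    """Hypothetical session-level context store for the WorkflowService.

    Keeps one context dict per session id, purely in memory for now.
    """

    def __init__(self) -> None:
        self._contexts: Dict[str, Dict[str, Any]] = {}

    def get(self, session_id: str) -> Dict[str, Any]:
        # Create the context lazily on first access for a session.
        return self._contexts.setdefault(session_id, {})

    def put(self, session_id: str, ctx: Dict[str, Any]) -> None:
        self._contexts[session_id] = ctx

    def drop(self, session_id: str) -> None:
        self._contexts.pop(session_id, None)
```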