We need the backend to generate a text transcript of the session for use with LLMs. This is similar to how the frontend generates a UI timeline/history of events, but this one is produced on the backend and has a few specific requirements:
- Not structured data; it's text (though it could have a structured intermediate representation if needed).
- Needs a definable "window size" that can be adjusted to the LLM's context size (usually measured in tokens, but could be approximated in characters).
- Needs to filter out responses from other assistants (or replace them with a statement indicating they were redacted).
We would use this as the input prompt for LLM assistant queries, though for now I think we should just send the immediate query for function choosing ("bridge, please add 10 and 20").
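Here is a minimal sketch of what the transcript builder could look like, assuming a simple in-memory event shape and a character count as a rough stand-in for a token budget. The names (`SessionEvent`, `buildTranscript`, `windowChars`, `assistantId`) are hypothetical, not existing APIs:

```typescript
// Hypothetical event shape for a session timeline entry.
type SessionEvent = {
  role: "user" | "assistant" | "system";
  assistantId?: string; // which assistant produced the event, if any
  text: string;
};

interface TranscriptOptions {
  windowChars: number; // rough character approximation of the token budget
  assistantId: string; // the assistant this transcript is being built for
}

function buildTranscript(events: SessionEvent[], opts: TranscriptOptions): string {
  const lines: string[] = [];

  for (const event of events) {
    // Requirement: responses from other assistants are redacted, with a
    // placeholder statement rather than silent omission.
    if (event.role === "assistant" && event.assistantId !== opts.assistantId) {
      lines.push("[response from another assistant redacted]");
      continue;
    }
    lines.push(`${event.role}: ${event.text}`);
  }

  // Requirement: trim to the configured window, keeping the most recent events.
  let transcript = "";
  for (let i = lines.length - 1; i >= 0; i--) {
    const candidate = lines[i] + "\n" + transcript;
    if (candidate.length > opts.windowChars) break;
    transcript = candidate;
  }
  return transcript.trimEnd();
}

// Example: build a ~4000-character transcript for the "bridge" assistant.
const prompt = buildTranscript(
  [
    { role: "user", text: "bridge, please add 10 and 20" },
    { role: "assistant", assistantId: "other-bot", text: "Sure, here's a poem." },
  ],
  { windowChars: 4000, assistantId: "bridge" }
);
```

The character budget is only an approximation; swapping in a real tokenizer for the window check would be straightforward if we need exact token counts later.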