Currently prompts are tracked as singular conversations, whereas they are often a chain of related requests. We should move to displaying chains. This is of key importance, as systems such as Copilot Edits, Cursor Composer, and others rely heavily on chained instructions.
@lukehinds can you clarify that a bit more? Is the idea that all the requests coming into a single chat session are displayed together in the UI?
Closes: #419
Before, we could use the `chat_id` in the output messages as a means
to group messages into conversations. This logic no longer works.
The new logic instead uses the user messages provided as input to the
LLM to map messages into conversations. LLMs usually receive all
previous user messages with each request. Example:
```
req1 = {messages:[{"role": "user", "content": "hello"}]}
req2 = {messages:[{"role": "user", "content": "hello"}, {"role": "user", "content": "how are you?"}]}
```
In this example, `req1` and `req2` should be mapped together to
form a single conversation.
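
The grouping described above could be sketched roughly as follows. This is a minimal illustration, not the actual implementation: it assumes each request carries an ordered list of user-message contents, and it groups one request into an existing conversation when that conversation's messages form a prefix of the request's messages. All names (`group_into_conversations`, the `requests` dict) are hypothetical.

```python
from typing import Dict, List


def group_into_conversations(requests: Dict[str, List[str]]) -> List[List[str]]:
    """Group request ids into conversations by user-message prefix matching.

    `requests` maps a request id to the ordered list of user-message
    contents that request sent to the LLM. A request extends an existing
    conversation when the conversation's current messages are a prefix
    of the request's messages.
    """
    # Process shorter (earlier) requests first so prefixes are seen
    # before their extensions; sorted() is stable, preserving input order
    # among requests of equal length.
    ordered = sorted(requests.items(), key=lambda kv: len(kv[1]))
    conversations: List[dict] = []  # each: {"ids": [...], "messages": [...]}
    for req_id, messages in ordered:
        for conv in conversations:
            if messages[: len(conv["messages"])] == conv["messages"]:
                # The request continues this conversation: record its id
                # and adopt its (longer) message history.
                conv["ids"].append(req_id)
                conv["messages"] = messages
                break
        else:
            # No existing conversation matched; start a new one.
            conversations.append({"ids": [req_id], "messages": messages})
    return [conv["ids"] for conv in conversations]


reqs = {
    "req1": ["hello"],
    "req2": ["hello", "how are you?"],
    "req3": ["what's the weather?"],
}
print(group_into_conversations(reqs))  # → [['req1', 'req2'], ['req3']]
```

Here `req2` resends `req1`'s message plus a new one, so both land in one conversation, while `req3` starts a fresh one.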