
Prompts should be grouped #419

Closed
lukehinds opened this issue Dec 19, 2024 · 1 comment · Fixed by #546
Comments

@lukehinds
Contributor

Currently prompts are tracked as singular conversations, whereas they are often a chain of conversations. We should move to displaying chains. This is of key importance, as systems such as Copilot Edits, Cursor Composer, and others rely heavily on chained instructions.

@yrobla
Contributor

yrobla commented Jan 8, 2025

@lukehinds can you clarify more about that? Is the idea that all the requests that come in during a single chat session are displayed together in the UI?

aponcedeleonch self-assigned this Jan 9, 2025
aponcedeleonch added a commit that referenced this issue Jan 10, 2025
Closes: #419

Before, we could use the `chat_id` in the output messages as a means
to group the messages into conversations. This logic no longer
works.

The new logic instead uses the user messages provided as input to
the LLM to map the messages into conversations. LLMs usually receive
all previous user messages with each request. Example:
```
req1 = {"messages": [{"role": "user", "content": "hello"}]}
req2 = {"messages": [{"role": "user", "content": "hello"}, {"role": "user", "content": "how are you?"}]}
```

In the example above, `req1` and `req2` should be mapped together to
form a single conversation.
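
The grouping rule in the commit message can be read as: two requests belong to the same conversation when the user messages of one are a prefix of the user messages of the other. Below is a minimal Python sketch of that idea, assuming OpenAI-style request bodies; the function names are illustrative and this is not the actual implementation merged in #546:

```
# Illustrative sketch of prefix-based conversation grouping; an
# assumption about the approach, not the code merged in #546.

def user_messages(request):
    # Ordered tuple of user message contents from a request body.
    return tuple(
        m["content"] for m in request["messages"] if m["role"] == "user"
    )

def group_into_conversations(requests):
    # Each conversation tracks its longest user-message tuple seen so
    # far, plus the requests mapped to it.
    conversations = []
    for req in requests:
        msgs = user_messages(req)
        for conv in conversations:
            prefix = conv["msgs"]
            # A request extends a conversation when the conversation's
            # user messages are a prefix of the request's user messages.
            if msgs[: len(prefix)] == prefix:
                conv["msgs"] = msgs
                conv["requests"].append(req)
                break
        else:
            conversations.append({"msgs": msgs, "requests": [req]})
    return [conv["requests"] for conv in conversations]

req1 = {"messages": [{"role": "user", "content": "hello"}]}
req2 = {"messages": [{"role": "user", "content": "hello"},
                     {"role": "user", "content": "how are you?"}]}

# Both requests end up in the same conversation.
assert group_into_conversations([req1, req2]) == [[req1, req2]]
```

Real traffic would presumably also need a tiebreak when a request's user messages match more than one conversation (e.g. prefer the longest matching prefix).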