
include reasoning tokens in ui #802

Open
tarasglek opened this issue Jan 26, 2025 · 7 comments
Labels
enhancement New feature or request

Comments

@tarasglek (Owner) commented Jan 26, 2025

DeepSeek returns reasoning tokens (see https://api-docs.deepseek.com/guides/reasoning_model). We should use the HTML details/summary feature for this.

OpenRouter is going to support this for all reasoning models.

This will also be interesting for explicitly including the reasoning context when switching models to do function calls, etc., which reasoning models are bad at.
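For reference, a rough sketch of what reading those tokens from an OpenAI-compatible stream could look like (the reasoning_content field follows the DeepSeek docs above; the client setup, env var, and function name are just illustrative):

```typescript
import OpenAI from "openai";

// Illustrative client setup; the real provider plumbing in the app differs.
const client = new OpenAI({
  baseURL: "https://api.deepseek.com",
  apiKey: process.env.DEEPSEEK_API_KEY,
});

async function streamWithReasoning(prompt: string) {
  const stream = await client.chat.completions.create({
    model: "deepseek-reasoner",
    messages: [{ role: "user", content: prompt }],
    stream: true,
  });

  let reasoning = "";
  let content = "";
  for await (const chunk of stream) {
    const delta = chunk.choices[0]?.delta;
    // reasoning_content is DeepSeek's extension, so it's not in the SDK types.
    const reasoningDelta = (delta as any)?.reasoning_content as string | undefined;
    if (reasoningDelta) reasoning += reasoningDelta;
    if (delta?.content) content += delta.content;
  }
  return { reasoning, content };
}
```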

@mulla028 (Collaborator)

Could you elaborate on this one? I would like to understand the whole idea 💡

@tarasglek (Owner, Author) commented Jan 26, 2025

We can render messages using https://developer.mozilla.org/en-US/docs/Web/HTML/Element/details:

<details>
  <summary>content from r1 (same as what we see now)</summary>
  reasoning_content from r1 model (we don't parse these out at the moment)
</details>

E.g., collapsed you would only see the content from r1 (same as what we see now); expanding it would reveal the reasoning_content (which we don't parse out at the moment).
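A minimal sketch of that rendering as a React component (the component and prop names are hypothetical, not existing ChatCraft code):

```tsx
import React from "react";

type Props = {
  content: string;            // what we render now
  reasoningContent?: string;  // e.g. r1's reasoning_content, once we parse it out
};

// Collapsed, only the normal content is visible; expanding reveals the reasoning.
export function MessageWithReasoning({ content, reasoningContent }: Props) {
  if (!reasoningContent) {
    return <div>{content}</div>;
  }
  return (
    <details>
      <summary>{content}</summary>
      <div>{reasoningContent}</div>
    </details>
  );
}
```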

mulla028 added the enhancement (New feature or request) label on Jan 28, 2025
@mulla028 (Collaborator)

@tarasglek sounds interesting, may I try?

@mulla028 (Collaborator) commented Jan 30, 2025

I just found time to try the new DeepSeek. Their deep thinking option is impressive, way better than what OpenAI's o1 model offers.

@mulla028 (Collaborator) commented Jan 30, 2025

Here's my proposal:

  • Add a checkbox (or something similar) so the user can see the reasoning tokens
  • If the checkbox is unchecked, the user gets a regular fast response without deep thinking

Question:
I assume this isn't supported by every model. How should I handle that, given that the user may choose any model they like?
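A rough sketch of one way to gate this, assuming a per-model capability flag plus a user preference (all of these names are hypothetical):

```typescript
type ModelInfo = {
  id: string;
  supportsReasoning: boolean; // e.g. true for deepseek-reasoner / R1
};

type UserPrefs = {
  showReasoning: boolean; // the proposed checkbox / preferences setting
};

// Models without reasoning support simply never return reasoning tokens,
// so the preference only has an effect where it can.
function shouldShowReasoning(model: ModelInfo, prefs: UserPrefs): boolean {
  return prefs.showReasoning && model.supportsReasoning;
}
```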

@humphd (Collaborator) commented Jan 31, 2025

I would put any UI for selecting/de-selecting this into the Preferences Modal vs. adding to the prompt area, which is already too busy.

@tarasglek (Owner, Author)

I think we should just add a thinking feature to our data model, e.g. add a .reasoning_content field to our messages (like they do). When it's present, show it in the UI during streaming and collapse it afterward.
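A minimal sketch of that data-model change (field names other than reasoning_content are illustrative, not the actual ChatCraft message type):

```typescript
interface ChatMessage {
  id: string;
  role: "system" | "user" | "assistant";
  content: string;
  reasoning_content?: string; // only set by reasoning models, mirroring DeepSeek's API
}

// Keep the reasoning section expanded while tokens stream in, then collapse it
// once the response is complete (e.g. by toggling the <details> open attribute).
function reasoningSectionOpen(msg: ChatMessage, isStreaming: boolean): boolean {
  return Boolean(msg.reasoning_content) && isStreaming;
}
```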
