include reasoning tokens in ui #802
Comments
Could you elaborate on this one? I would like to understand the entire idea 💡
We can render messages using https://developer.mozilla.org/en-US/docs/Web/HTML/Element/details:

<details>
  <summary>content from r1 (same as what we see now)</summary>
  reasoning_content from r1 model (we don't parse these out atm)
</details>

Eg:

|reasoning_content from r1 model (we don't parse these out atm)|
content from r1 (same as what we see now)
@tarasglek sounds interesting, may I try?
I just now found time to try the new DeepSeek; their deep thinking option is impressive, way better than what OpenAI's o1 model offers.
Here's my proposal:
Questions:
I would put any UI for selecting/de-selecting this into the Preferences Modal vs. adding it to the prompt area, which is already too busy.
I think we should just add a thinking feature to our data model, e.g. add a .reasoning_content field to our messages, like they do, and when it's present show it in the UI during streaming and collapse it afterward.
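A minimal sketch of that idea, assuming a hypothetical `MessageData` shape and `renderMessage` helper (illustrative names, not the project's actual API):

```typescript
// Hypothetical message shape: reasoning_content is optional and only set
// when the model emits reasoning tokens (e.g. DeepSeek R1).
interface MessageData {
  content: string;
  reasoning_content?: string;
}

// While streaming, keep the <details> element open so the reasoning is visible;
// once the message is complete, drop `open` so the reasoning collapses.
function renderMessage(msg: MessageData, isStreaming: boolean): string {
  if (!msg.reasoning_content) {
    return msg.content;
  }
  return `<details${isStreaming ? " open" : ""}>
  <summary>${msg.content}</summary>
  ${msg.reasoning_content}
</details>`;
}
```

The `open` attribute is what controls the expanded state of `<details>`, so dropping it once streaming finishes gives the collapse-afterward behavior without any extra UI state.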
DeepSeek returns reasoning tokens (https://api-docs.deepseek.com/guides/reasoning_model); see the sketch below. We should use the HTML details/summary feature for this.
OpenRouter is going to support this for all reasoning models.
This is also going to be interesting for explicitly including reasoning context when switching models to do function calls, etc., which reasoning models are bad at.
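A rough sketch of collecting the reasoning tokens during streaming, using the OpenAI-compatible SDK against DeepSeek's API. The separate `reasoning_content` delta field is per the docs linked above, but the exact shapes and names here are assumptions, not the project's code:

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.deepseek.com",
  apiKey: process.env.DEEPSEEK_API_KEY,
});

// Stream a completion from deepseek-reasoner and accumulate the reasoning
// tokens and the regular content separately.
async function streamWithReasoning(prompt: string) {
  const stream = await client.chat.completions.create({
    model: "deepseek-reasoner",
    messages: [{ role: "user", content: prompt }],
    stream: true,
  });

  let reasoning = "";
  let content = "";
  for await (const chunk of stream) {
    // reasoning_content is a DeepSeek extension, so it is not in the SDK's types.
    const delta = chunk.choices[0]?.delta as
      | { content?: string | null; reasoning_content?: string | null }
      | undefined;
    if (delta?.reasoning_content) reasoning += delta.reasoning_content;
    if (delta?.content) content += delta.content;
  }
  return { reasoning, content };
}
```

Keeping the two strings separate is what lets the UI stream the reasoning into the collapsible block while the regular content renders as it does today.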