fix: now reasoning output is rendered in the UI #29
base: main
Conversation
Force-pushed 079b6b1 to 42cb73b.
Pull Request Overview
This PR updates the chat API to support streaming reasoning output and adjusts UI components to render that output progressively.
- Enables streaming of reasoning output when the model name starts with "o3" or "o4"
- Buffers reasoning summary updates and flushes them when the summary completes
- Replaces direct ReactMarkdown usage with a reusable MarkdownContent component in chat-related UI components
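The model-prefix check described above can be sketched as follows. This is a hedged illustration, not the PR's actual code: the function name `buildRequestOptions`, the `ChatRequest` shape, and the `reasoning: { summary: "auto" }` payload are assumptions made for the example.

```typescript
// Hypothetical sketch of enabling reasoning output by model prefix.
// All names and the request shape are assumptions, not the PR's real code.
type ChatRequest = {
  model: string;
  stream: boolean;
  reasoning?: { summary: "auto" };
};

// Request reasoning summaries only for models whose name starts with "o3" or "o4".
function buildRequestOptions(model: string): ChatRequest {
  const req: ChatRequest = { model, stream: true };
  if (model.startsWith("o3") || model.startsWith("o4")) {
    req.reasoning = { summary: "auto" };
  }
  return req;
}
```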
Reviewed Changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated 2 comments.
Show a summary per file
| File | Description |
|---|---|
| src/routes/api/chat.ts | Adds reasoning configuration to the API request based on the model prefix |
| src/lib/streaming.ts | Buffers and flushes reasoning summary text events in the stream |
| src/components/ReasoningMessage.tsx | Creates a dedicated component to render reasoning messages |
| src/components/MarkdownContent.tsx | Extracts markdown rendering into a reusable component |
| src/components/ChatMessage.tsx | Replaces direct ReactMarkdown usage with MarkdownContent |
| src/components/Chat.tsx | Incorporates streaming reasoning events into the chat UI |
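The buffer-and-flush behavior attributed to src/lib/streaming.ts can be sketched like this. It is a minimal illustration under stated assumptions: the event names (`reasoning_summary.delta` / `reasoning_summary.done`) and the `ReasoningBuffer` class are hypothetical, chosen only to show accumulating delta chunks and emitting the whole summary on completion.

```typescript
// Hypothetical sketch of buffering reasoning summary deltas and flushing
// them once the summary completes. Event names are assumptions.
type StreamEvent =
  | { type: "reasoning_summary.delta"; text: string }
  | { type: "reasoning_summary.done" };

class ReasoningBuffer {
  private parts: string[] = [];

  // Accumulates delta text; returns the full summary on "done", otherwise null.
  push(event: StreamEvent): string | null {
    if (event.type === "reasoning_summary.delta") {
      this.parts.push(event.text);
      return null;
    }
    const summary = this.parts.join("");
    this.parts = []; // reset for the next summary
    return summary;
  }
}
```

Flushing only on the `done` event matches the PR description below, where all chunks for a summary currently arrive in the UI at once.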
Force-pushed 1ff6422 to d60c113.
This is good to go. Do you still get crashes with this, @wasaga? I was never able to replicate the reported crashes. Also, for
it's already set to |
Force-pushed d60c113 to 4c5b4f7.
Now reasoning output is rendered in the UI. Currently, all the chunks for a reasoning summary get flushed at the same time; we can probably stream those in incrementally.
Closes #16
CleanShot.2025-05-29.at.22.21.51.mp4
Notes from @wasaga:
Note it already has `streaming: true`.