
fix: now reasoning output is rendered in the UI #29


Open · wants to merge 3 commits into main from nickytonline/reasoning-model-output

Conversation

@nickytonline (Member) commented on May 30, 2025:

Now reasoning output is rendered in the UI. Currently all the chunks for a reasoning summary get flushed at the same time. We can probably stream those in.
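The flush-all-at-once behavior described above can be sketched as a small buffer that accumulates summary deltas and only emits once the summary completes. This is a hypothetical TypeScript sketch, not the PR's actual code: the event type names follow the OpenAI Responses API streaming events, while `createReasoningBuffer` and `flush` are illustrative names.

```typescript
// Hypothetical sketch: buffer reasoning-summary deltas, flush on completion.
type ReasoningEvent =
  | { type: "response.reasoning_summary_text.delta"; delta: string }
  | { type: "response.reasoning_summary_text.done" };

function createReasoningBuffer(flush: (text: string) => void) {
  let buffer = "";
  return (event: ReasoningEvent): void => {
    if (event.type === "response.reasoning_summary_text.delta") {
      // Accumulate instead of emitting immediately.
      buffer += event.delta;
    } else if (event.type === "response.reasoning_summary_text.done") {
      // All chunks reach the UI at once; streaming deltas through
      // individually would be the follow-up improvement.
      flush(buffer);
      buffer = "";
    }
  };
}
```

Streaming the deltas straight through would amount to calling `flush(event.delta)` inside the first branch instead of buffering.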

Closes #16

(Video attachment: CleanShot.2025-05-29.at.22.21.51.mp4)

(Screenshot attachment: CleanShot 2025-05-30 at 13 05 24@2x)

Notes from @wasaga:

> I think we might need to spawn this task with `background: true, streaming: true` so that it can run in the background, because o3, for example, has quotas that it may otherwise hit.

Note that it already has `streaming: true`.

@nickytonline force-pushed the nickytonline/reasoning-model-output branch from 079b6b1 to 42cb73b on May 30, 2025 03:09
@nickytonline changed the title from "feat: now reasoning output is rendered in the UI" to "fix: now reasoning output is rendered in the UI" on May 30, 2025
@nickytonline requested a review from @wasaga on May 30, 2025 03:13
@nickytonline marked this pull request as ready for review on May 30, 2025 03:13
@nickytonline requested a review from Copilot on May 30, 2025 03:27
Copilot AI left a comment:

Pull Request Overview

This PR updates the chat API to support streaming reasoning output and adjusts UI components to render that output progressively.

  • Enables streaming of reasoning output when the model name starts with "o3" or "o4"
  • Buffers reasoning summary updates and flushes them upon completion
  • Replaces ReactMarkdown with a custom MarkdownContent component in chat-related UI components

Reviewed Changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated 2 comments.

| File | Description |
| --- | --- |
| src/routes/api/chat.ts | Adds reasoning configuration to the API request based on the model prefix |
| src/lib/streaming.ts | Buffers and flushes reasoning summary text events in the stream |
| src/components/ReasoningMessage.tsx | Adds a dedicated component to render reasoning messages |
| src/components/MarkdownContent.tsx | Extracts markdown rendering into a reusable component |
| src/components/ChatMessage.tsx | Replaces direct ReactMarkdown usage with MarkdownContent |
| src/components/Chat.tsx | Incorporates streaming reasoning events into the chat UI |
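The model-prefix check that the review summary attributes to src/routes/api/chat.ts could look roughly like the following. This is a sketch under assumptions, not the diff itself: the function name `reasoningConfigFor` and the `summary` field value are hypothetical (the Responses API accepts reasoning summary settings such as `"auto"`, `"concise"`, and `"detailed"`, but which one the PR uses is not stated here).

```typescript
// Hypothetical sketch: attach reasoning config only for o3/o4 models.
interface ReasoningConfig {
  summary: "auto" | "concise" | "detailed";
}

function reasoningConfigFor(model: string): ReasoningConfig | undefined {
  const isReasoningModel = model.startsWith("o3") || model.startsWith("o4");
  // Non-reasoning models get no reasoning block in the API request.
  return isReasoningModel ? { summary: "detailed" } : undefined;
}
```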

@nickytonline force-pushed the nickytonline/reasoning-model-output branch 2 times, most recently from 1ff6422 to d60c113 on June 3, 2025 03:33
@nickytonline (Member, Author) commented:

This is good to go. Do you still get crashes with this, @wasaga? I was never able to replicate the reported crashes.

Also, regarding:

> I think we might need to spawn this task with `background: true, streaming: true` so that it can run in the background, because o3, for example, has quotas that it may otherwise hit.

It's already set to `streaming: true`, but do we still want to spawn this task with `background: true`, or should we do that in a follow-up PR?
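The proposed follow-up amounts to adding one more option alongside the streaming flag that this PR already sets. A minimal sketch of the request options, using the field names from the discussion above (`streaming`, `background`); the option shape and `buildBackgroundRequest` helper are assumptions for illustration, not the project's actual API:

```typescript
// Hypothetical sketch of the request options discussed in the thread.
interface TaskRequestOptions {
  model: string;
  input: string;
  streaming: boolean;
  background: boolean;
}

function buildBackgroundRequest(model: string, input: string): TaskRequestOptions {
  return {
    model,
    input,
    streaming: true,  // already set in this PR
    background: true, // proposed follow-up so long o3 runs avoid quota limits
  };
}
```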

@nickytonline force-pushed the nickytonline/reasoning-model-output branch from d60c113 to 4c5b4f7 on June 13, 2025 22:19
Development

Successfully merging this pull request may close these issues.

reasoning models output are not supported