
[Bugfix] Streaming continuous_usage_stats default to False #9709

Merged

Conversation

samos123 (Contributor)

This is needed to be compatible with the OpenAI API by default.

FIX #7262
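
As context, here is a minimal sketch of the behavior this default restores, using the `openai` Python client against a local vLLM server (the base URL, API key, and model name are placeholders, not from this PR): with `stream_options.include_usage` requested, usage stats should appear only on the final chunk, not on every chunk.

```python
# Minimal sketch, assuming a vLLM OpenAI-compatible server on localhost:8000.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

stream = client.chat.completions.create(
    model="my-model",  # placeholder model name
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
    stream_options={"include_usage": True},
)

for chunk in stream:
    if chunk.choices:
        print(chunk.choices[0].delta.content or "", end="")
    if chunk.usage is not None:
        # With the OpenAI-compatible default, usage arrives only here, on the
        # final chunk (whose choices list is empty). Before this fix, vLLM
        # attached usage stats to every chunk.
        print("\nusage:", chunk.usage)
```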


👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default; only the fastcheck CI runs, covering a small, essential subset of tests to catch errors quickly. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to be added to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can do one of the following:

  • Add the ready label to the PR
  • Enable auto-merge.

🚀

This is needed to be compatible with the OpenAI API by default.

Fixes vllm-project#7262

Signed-off-by: Sam Stoelinga <[email protected]>
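
For illustration, a hedged sketch of the kind of change this commit makes: in vLLM's OpenAI protocol models, the `continuous_usage_stats` field of the stream options now defaults to False, so per-chunk usage reporting becomes opt-in (the class shown here is a simplified stand-in, not the exact diff):

```python
# Illustrative sketch only; field names follow the PR title and linked issue.
from typing import Optional
from pydantic import BaseModel

class StreamOptions(BaseModel):
    include_usage: Optional[bool] = None
    # Previously this vLLM-specific extension defaulted to True, sending usage
    # stats on every streamed chunk. Defaulting to False matches the OpenAI
    # API, which reports usage only on the final chunk.
    continuous_usage_stats: Optional[bool] = False
```

Clients that want per-chunk usage can presumably still opt in by setting `continuous_usage_stats` to true inside `stream_options`.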
@samos123 force-pushed the fix-7262-disable-usage-streaming branch from 8d0e013 to 25a4a08 on October 26, 2024 01:50
@DarkLight1337 (Member) left a comment


Thanks for helping out!

@DarkLight1337 enabled auto-merge (squash) on October 26, 2024 03:24
@github-actions bot added the ready (ONLY add when PR is ready to merge/full CI is needed) label on Oct 26, 2024
@DarkLight1337 merged commit 067e77f into vllm-project:main on Oct 26, 2024
69 checks passed
MErkinSag pushed a commit to MErkinSag/vllm that referenced this pull request Oct 26, 2024
cooleel pushed a commit to cooleel/vllm that referenced this pull request Oct 28, 2024
FerdinandZhong pushed a commit to FerdinandZhong/vllm that referenced this pull request Oct 29, 2024
rasmith pushed a commit to rasmith/vllm that referenced this pull request Oct 30, 2024
NickLucche pushed a commit to NickLucche/vllm that referenced this pull request Oct 31, 2024
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Nov 4, 2024
sumitd2 pushed a commit to sumitd2/vllm that referenced this pull request Nov 14, 2024
KuntaiDu pushed a commit to KuntaiDu/vllm that referenced this pull request Nov 20, 2024
mfournioux pushed a commit to mfournioux/vllm that referenced this pull request Nov 20, 2024
tlrmchlsmth pushed a commit to neuralmagic/vllm that referenced this pull request Nov 23, 2024
sleepwalker2017 pushed a commit to sleepwalker2017/vllm that referenced this pull request Dec 13, 2024
Labels
ready (ONLY add when PR is ready to merge/full CI is needed)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

[Bug]: stream_options.include_usage being retrieved on every chunk
2 participants