
[Misc] Minor patch for draft model runner #6523

Merged 1 commit into vllm-project:main on Jul 18, 2024

Conversation

comaniac (Collaborator)

Add comments for the global debugging flags, patching #6338.

cc @cadedaniel @alexm-neuralmagic


👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, they only trigger the fastcheck CI, which consists of a small, essential subset of tests to quickly catch errors, with the flexibility to run extra individual tests on top (you can do this by unblocking test steps in the Buildkite run).

A full CI run is still required to merge this PR, so once the PR is ready to go, please make sure to run it. If you need all test signals between PR commits, you can trigger a full CI run as well.

To run full CI, you can do one of these:

  • Comment /ready on the PR
  • Add the ready label to the PR
  • Enable auto-merge

🚀

@comaniac comaniac added the ready ONLY add when PR is ready to merge/full CI is needed label Jul 17, 2024
@Yard1 Yard1 changed the title [Mise] Miner patch for draft model runner [Mise] Minor patch for draft model runner Jul 17, 2024
@Yard1 Yard1 changed the title [Mise] Minor patch for draft model runner [Misc] Minor patch for draft model runner Jul 17, 2024
@cadedaniel cadedaniel enabled auto-merge (squash) July 18, 2024 05:39
@cadedaniel cadedaniel merged commit 8a74c68 into vllm-project:main Jul 18, 2024
87 checks passed
fialhocoelho pushed a commit to opendatahub-io/vllm that referenced this pull request Jul 19, 2024
xjpang pushed a commit to xjpang/vllm that referenced this pull request Jul 24, 2024
gnpinkert pushed a commit to gnpinkert/vllm that referenced this pull request Jul 26, 2024
Alvant pushed a commit to compressa-ai/vllm that referenced this pull request Oct 26, 2024
KuntaiDu pushed a commit to KuntaiDu/vllm that referenced this pull request Nov 20, 2024
@comaniac comaniac deleted the quick-patch branch January 3, 2025 21:51
Labels
ready ONLY add when PR is ready to merge/full CI is needed