
[torch.compile] Add torch inductor pass for fusing silu_and_mul with subsequent scaled_fp8_quant operations #10867

Open · wants to merge 35 commits into base: main
Conversation

@SageMoore (Contributor) commented Dec 3, 2024

Credit to @LucasWilkinson for the kernel.

This pass currently supports only static per-tensor quantization. Other quantization schemes will be added in subsequent PRs.

I've attached some QPS sweeps run with neuralmagic/Meta-Llama-3.1-8B-Instruct-FP8 on an H100. Generally speaking, this pass improves the TPOT of FP8 Llama by 2-3%. TTFT shows similar improvements, with the exception of 20 QPS, which is much faster (~2x).

[Attached charts: fused_results, torch_compile_results]
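For context, the unfused graph runs silu_and_mul and scaled_fp8_quant as two separate kernels, with an intermediate fp16/bf16 tensor written to and read back from global memory in between; the pass replaces that pair with one fused op. Below is a minimal, torch-free Python sketch of the numerics only (function names and the fp8 e4m3 max of 448.0 illustrate the scheme; this is not the vLLM kernel code):

```python
import math

FP8_E4M3_MAX = 448.0  # max representable magnitude in float8 e4m3

def silu(x: float) -> float:
    # SiLU(x) = x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

def silu_and_mul(x: list[float]) -> list[float]:
    # Gated activation: split the hidden dim in half,
    # apply SiLU to the first half, multiply by the second.
    d = len(x) // 2
    return [silu(x[i]) * x[d + i] for i in range(d)]

def scaled_fp8_quant(y: list[float], scale: float) -> list[int]:
    # Static per-tensor quantization: divide by a precomputed scale,
    # clamp to the e4m3 range, round to nearest. (Rounded ints are used
    # here purely for illustration; real kernels emit fp8 values.)
    out = []
    for v in y:
        q = max(-FP8_E4M3_MAX, min(FP8_E4M3_MAX, v / scale))
        out.append(round(q))
    return out

def fused_silu_mul_quant(x: list[float], scale: float) -> list[int]:
    # The fused kernel computes both steps in one pass, skipping the
    # intermediate-tensor round trip through global memory.
    d = len(x) // 2
    out = []
    for i in range(d):
        v = silu(x[i]) * x[d + i]
        q = max(-FP8_E4M3_MAX, min(FP8_E4M3_MAX, v / scale))
        out.append(round(q))
    return out
```

The fused path must be bit-for-bit equivalent to the composition of the two unfused ops, which is what the pattern-matching pass relies on when rewriting the graph.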


github-actions bot commented Dec 3, 2024

👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, only fastcheck CI runs, covering a small and essential subset of CI tests to quickly catch errors. You can run other CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can do one of the following:

  • Add the ready label to the PR
  • Enable auto-merge

🚀

@mergify mergify bot added the ci/build label Dec 3, 2024
Signed-off-by: Sage Moore <[email protected]>
@SageMoore SageMoore force-pushed the sage/silu-mul-quant branch from 27be0bd to e2fda7f Compare December 6, 2024 20:34
@SageMoore SageMoore marked this pull request as ready for review December 6, 2024 20:36
Signed-off-by: Sage Moore <[email protected]>
@tlrmchlsmth (Collaborator) left a comment


Focused on csrc/quantization/activation_kernels.cu. Spotted a couple of potential int32_t overflows.

Review threads (resolved):
  • csrc/core/math.hpp
  • csrc/quantization/activation_kernels.cu
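To illustrate the overflow concern: CUDA kernels commonly flatten (token, column) into a single element index, and if that product is computed in int32_t arithmetic it can silently wrap for large shapes. A hypothetical back-of-the-envelope check (not code from this PR):

```python
INT32_MAX = 2**31 - 1

def flat_index_overflows(num_tokens: int, hidden_size: int) -> bool:
    # Kernels often compute token_idx * hidden_size + col as the
    # flattened element index. Held in an int32_t, large
    # batch-by-hidden-size combinations wrap around silently.
    return num_tokens * hidden_size > INT32_MAX

# e.g. 150k tokens with a 16384-wide intermediate already overflows:
# 150_000 * 16_384 = 2_457_600_000 > 2_147_483_647
```

The usual fix is to do the index arithmetic in int64_t (or size_t) before any multiplication, not just for the final index.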

mergify bot commented Dec 18, 2024

This pull request has merge conflicts that must be resolved before it can be merged. Please rebase the PR, @SageMoore.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify mergify bot added the needs-rebase label Dec 18, 2024
@tlrmchlsmth (Collaborator) left a comment


A couple more comments. LGTM if we can support non-power-of-two hidden sizes.

Review threads (resolved):
  • tests/compile/test_functionalization.py
  • tests/kernels/test_fused_quant_activation.py
  • csrc/quantization/activation_kernels.cu
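On the non-power-of-two point: supporting arbitrary hidden sizes generally requires only a ceiling-divide launch bound plus a bounds check for the ragged tail, rather than power-of-two masking tricks. A hypothetical sketch of the indexing scheme (not the PR's kernel):

```python
def cdiv(a: int, b: int) -> int:
    # Ceiling division, the standard way to size a CUDA grid.
    return -(-a // b)

def launch_elements(hidden_size: int, block_size: int = 256) -> list[int]:
    # A power-of-two assumption lets a kernel use masking tricks like
    # i & (hidden_size - 1); arbitrary sizes just need a ceiling-divide
    # grid and a bounds check in the loop body.
    blocks = cdiv(hidden_size, block_size)
    covered = []
    for b in range(blocks):           # stand-in for blockIdx.x
        for t in range(block_size):   # stand-in for threadIdx.x
            i = b * block_size + t
            if i < hidden_size:       # bounds check handles the ragged tail
                covered.append(i)
    return covered
```

Every element index in [0, hidden_size) is covered exactly once, whether or not hidden_size divides evenly into blocks.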

mergify bot commented Dec 19, 2024

This pull request has merge conflicts that must be resolved before it can be merged. Please rebase the PR, @SageMoore.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify mergify bot added the needs-rebase label Dec 19, 2024
@mergify mergify bot removed the needs-rebase label Dec 19, 2024
@SageMoore (Contributor, Author) commented
Apologies for the noise. I accidentally added my signature to a bunch of irrelevant commits, which pulled them into the PR temporarily. Things should be sorted now.

Signed-off-by: Sage Moore <[email protected]>
Review threads (resolved): csrc/quantization/activation_kernels.cu
} // namespace vllm

// Launch activation, gating, and quantize kernel.
#define LAUNCH_ACTIVATION_GATE_KERNEL(KERNEL) \
Contributor commented:

Is there a reason this needs a macro?

SageMoore (Contributor, Author) replied:
I just copied what the existing act_and_mul kernel does. This allows us to just drop in kernels for the other activation functions. I'm in favor of keeping it.
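In spirit, the macro stamps out one fused kernel per activation function while the gate-multiply structure stays shared. A rough Python analogue is a dispatch table keyed by activation; all names below are illustrative, not vLLM APIs:

```python
import math

# Each entry plays the role of one KERNEL argument to the launch macro:
# swap the elementwise activation, reuse the gated-multiply structure.
ACTIVATIONS = {
    "silu": lambda x: x / (1.0 + math.exp(-x)),
    "gelu_tanh": lambda x: 0.5 * x * (1.0 + math.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3))),
}

def act_and_mul(kind: str, x: list[float]) -> list[float]:
    # Shared structure: activation on the first half of the hidden dim,
    # elementwise multiply by the second half.
    act = ACTIVATIONS[kind]
    d = len(x) // 2
    return [act(x[i]) * x[d + i] for i in range(d)]
```

Adding a new fused variant then amounts to one new table entry, which mirrors how the macro lets new activation kernels drop in.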

Review thread (resolved): tests/compile/test_functionalization.py
@youkaichao (Member) left a comment


Glad to see more fusion passes. I will hand it over to @tlrmchlsmth and @ProExpertProg for detailed review.

Signed-off-by: Sage Moore <[email protected]>
Labels: ci/build, frontend, ready
Projects: None yet
5 participants