Inductor may pad mm for some shapes, i.e., replace a matmul with zeros, cat, matmul, and slice. On A100 with fp16 GPT2, this feature can improve performance by 30%. Check whether it is useful on XPU. https://github.com/pytorch/pytorch/blob/main/torch/_inductor/fx_passes/pad_mm.py
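For illustration, here is a minimal eager-mode sketch of the padding idea, not the actual FX graph rewrite in `pad_mm.py` (which also applies dtype and cost heuristics). The function name `padded_mm` and the alignment value of 8 are assumptions for the example; zero-padding the contracted dimension does not change the result in the non-padded region, so slicing recovers the original output.

```python
import torch
import torch.nn.functional as F

def padded_mm(a: torch.Tensor, b: torch.Tensor, align: int = 8) -> torch.Tensor:
    """Hypothetical sketch: pad matmul operands to multiples of `align`,
    run the matmul on aligned shapes, then slice back to the original size."""
    m, k = a.shape
    k2, n = b.shape
    assert k == k2, "inner dimensions must match"
    pad_m = (-m) % align
    pad_k = (-k) % align
    pad_n = (-n) % align
    # F.pad pads the last dimension first: (left, right, top, bottom).
    a_padded = F.pad(a, (0, pad_k, 0, pad_m))   # (m + pad_m, k + pad_k)
    b_padded = F.pad(b, (0, pad_n, 0, pad_k))   # (k + pad_k, n + pad_n)
    out = a_padded @ b_padded                    # aligned GEMM
    return out[:m, :n]                           # slice away the padding

# Odd fp16 shapes that would otherwise hit unaligned (slower) GEMM kernels.
a = torch.randn(1023, 769, dtype=torch.float16)
b = torch.randn(769, 3071, dtype=torch.float16)
out = padded_mm(a, b)
print(out.shape)  # torch.Size([1023, 3071])
```

Whether the padded path wins depends on the hardware's alignment requirements for fast GEMM kernels, which is exactly what needs to be measured on XPU.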