
[PyTorch] Implement Fp8 padding and unpadding module #1129

Merged

Commits on Aug 30, 2024

  1. [PyTorch] Add FP8 padding and unpadding module

     1. Add multi-tensor padding kernel
     2. Add FP8Padding and Fp8Unpadding module
     3. Add padding grouped linear UT case
    
    Signed-off-by: beinggod <[email protected]>
    beinggod committed Aug 30, 2024
    SHA: c5fd1b4
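To illustrate what a padding/unpadding pair for grouped FP8 GEMMs does, here is a minimal PyTorch sketch. It is not the PR's kernel (the PR implements a fused multi-tensor CUDA kernel); the function names `pad_groups`/`unpad_groups` and the default alignment of 16 rows per group are assumptions for illustration — FP8 GEMMs typically require each group's row count to be padded to a hardware-friendly multiple.

```python
import torch

def pad_groups(inp, group_sizes, align=16):
    # Zero-pad each group's row count up to a multiple of `align`.
    # inp: (sum(group_sizes), hidden) tensor of concatenated groups.
    # Returns the padded tensor and the padded per-group sizes.
    padded_sizes = [(n + align - 1) // align * align for n in group_sizes]
    hidden = inp.shape[1]
    out = inp.new_zeros(sum(padded_sizes), hidden)
    src = dst = 0
    for n, p in zip(group_sizes, padded_sizes):
        out[dst:dst + n] = inp[src:src + n]  # copy real rows; pad rows stay zero
        src += n
        dst += p
    return out, padded_sizes

def unpad_groups(inp, group_sizes, padded_sizes):
    # Inverse of pad_groups: drop the zero rows appended to each group.
    hidden = inp.shape[1]
    out = inp.new_empty(sum(group_sizes), hidden)
    src = dst = 0
    for n, p in zip(group_sizes, padded_sizes):
        out[dst:dst + n] = inp[src:src + n]
        src += p
        dst += n
    return out
```

A round trip through `pad_groups` followed by `unpad_groups` recovers the original tensor, which is the invariant the PR's unit test for the padded grouped linear case would exercise.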

Commits on Aug 31, 2024

  1. SHA: 4cef386

Commits on Sep 4, 2024

  1. Refine UT & simplify multi-padding kernel

    Signed-off-by: beinggod <[email protected]>
    beinggod committed Sep 4, 2024
    SHA: 27d8082

Commits on Sep 5, 2024

  1. SHA: a9df151
  2. SHA: 59c5021