General support of i1 mask in attention #19380

Open
lialan opened this issue Dec 5, 2024 · 1 comment
Assignees: lialan
Labels: enhancement ➕ New feature or request

lialan (Contributor) commented Dec 5, 2024

Request description

  • Codegen support for general i1 masks in attention with a packed i1 memory layout (see the packing sketch after this list)
  • End-to-end (e2e) tests for attention with i1 masks
  • A new encoding attribute to indicate the memory layout of a tensor
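
Not an IREE code path, just a minimal NumPy sketch of what a packed i1 attention mask means; `pack_i1_mask` and `apply_packed_mask` are hypothetical helpers for illustration. Eight i1 elements share one byte, so a boolean mask row of length n occupies ceil(n/8) bytes instead of n:

```python
import numpy as np

def pack_i1_mask(mask_bool: np.ndarray) -> np.ndarray:
    """Pack a boolean attention mask into a bit-packed uint8 buffer.

    Eight i1 values share one byte, so the innermost dimension shrinks
    from n to ceil(n / 8). (Hypothetical helper, not an IREE API.)
    """
    return np.packbits(mask_bool, axis=-1)

def apply_packed_mask(scores: np.ndarray, packed: np.ndarray) -> np.ndarray:
    """Unpack the i1 mask and apply it to attention scores.

    Positions whose mask bit is 0 are set to -inf before softmax.
    """
    n = scores.shape[-1]
    mask = np.unpackbits(packed, axis=-1, count=n).astype(bool)
    return np.where(mask, scores, -np.inf)

# An 8x8 causal mask packs into one byte per row: 8 bytes instead of 64.
scores = np.random.randn(8, 8).astype(np.float32)
causal = np.tril(np.ones((8, 8), dtype=bool))
packed = pack_i1_mask(causal)            # shape (8, 1), dtype uint8
masked = apply_packed_mask(scores, packed)
```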

What component(s) does this issue relate to?

  • Frontend: support emitting packed i1 tensors
  • Codegen: support the i1 datatype as a mask without upcasting/downcasting to a wider type (a rough size comparison follows this list)
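
To make the upcasting concern concrete, a rough size comparison between widening each i1 element to i8 and keeping the bit-packed i1 layout; the numbers are illustrative, not measurements from IREE:

```python
import numpy as np

seq = 4096
mask = np.tril(np.ones((seq, seq), dtype=bool))  # causal attention mask

# Upcast path the issue wants to avoid: one full byte per i1 element.
upcast_bytes = mask.astype(np.uint8).nbytes        # 16 MiB for 4096x4096

# Packed i1 layout: eight mask bits stored per byte.
packed_bytes = np.packbits(mask, axis=-1).nbytes   # 2 MiB, an 8x reduction

print(upcast_bytes, packed_bytes)
```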

Additional context

No response

lialan added the enhancement ➕ New feature or request label on Dec 5, 2024
lialan self-assigned this on Dec 5, 2024
lialan (Contributor, Author) commented Dec 5, 2024
