
fix(internevo): mock flash_attn_2_cuda #54

Merged 1 commit into main from internevo on Mar 13, 2024
Conversation

lljbash (Collaborator) commented on Mar 13, 2024

InternEVO uses FlashAttention 2.2.1, where the CUDA module is renamed from flash_attn_cuda to flash_attn_2_cuda. This commit mocks the correct module name.
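
For reference, a minimal sketch of the kind of mock involved: registering a stub under the renamed extension module so that importing it succeeds on machines without the compiled CUDA extension. The `sys.modules` / `MagicMock` approach below is one common way to do this and is illustrative only; the repository's actual mock mechanism may differ.

```python
import sys
from unittest import mock

# FlashAttention 2.2.1 ships its compiled extension as `flash_attn_2_cuda`
# (older releases used `flash_attn_cuda`). Registering a stub under the new
# name lets `import flash_attn_2_cuda` succeed even when the CUDA extension
# is not installed.
sys.modules["flash_attn_2_cuda"] = mock.MagicMock()

import flash_attn_2_cuda  # resolves to the stub registered above
```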

lljbash added the "bug" label (Something isn't working) on Mar 13, 2024
lljbash self-assigned this on Mar 13, 2024
lljbash requested a review from wiryls on Mar 13, 2024 at 07:55
lljbash merged commit 0528ab2 into main on Mar 13, 2024
4 checks passed
lljbash deleted the internevo branch on Mar 13, 2024 at 10:57