
feat: support flash attention for InternLM on ascend #41

Closed
Wants to merge 18 commits.

Conversation

@POI-WX (Collaborator) commented Jan 29, 2024

No description provided.

@lljbash lljbash marked this pull request as draft January 30, 2024 04:33
@POI-WX POI-WX changed the title draft for supporting flash attention draft for supporting flash attention on ascend Mar 14, 2024
@POI-WX POI-WX force-pushed the wx/support_flash_attention_for_ascend branch from 204bd1b to 5bc1450 Compare March 18, 2024 08:54
@POI-WX POI-WX changed the title draft for supporting flash attention on ascend draft for supporting flash attention for internlm on ascend Mar 19, 2024
@POI-WX POI-WX changed the title draft for supporting flash attention for internlm on ascend feat: add support of flash attention for InternLM on ascend Mar 19, 2024
@POI-WX POI-WX changed the title feat: add support of flash attention for InternLM on ascend feat: support flash attention for InternLM on ascend Mar 19, 2024
@POI-WX POI-WX marked this pull request as ready for review March 19, 2024 09:18
@POI-WX POI-WX closed this Apr 5, 2024
@POI-WX POI-WX reopened this Apr 8, 2024
@POI-WX POI-WX closed this Apr 22, 2024
@POI-WX POI-WX deleted the wx/support_flash_attention_for_ascend branch April 24, 2024 04:10
1 participant