[Fix] Support actual seqlen in flash-attention2 #80
Triggered via pull request September 18, 2023 05:57
Status Success
Total duration 42m 5s

linux-x64-gpu.yml

on: pull_request