[Fix] Support actual seqlen in flash-attention2 #83

Triggered via pull request on September 18, 2023, 10:58
Status: Success
Total duration: 56m 25s
Workflow: linux-x64-gpu.yml
Trigger: pull_request