[Fix] Support actual seqlen in flash-attention2 #78

Triggered via pull_request on September 15, 2023 at 03:30
Status: Success
Total duration: 55m 44s

Workflow file: linux-x64-gpu.yml (on: pull_request)
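
The run page only records the workflow file name and its trigger; the file's contents are not shown here. As a minimal sketch, a GitHub Actions workflow matching that name and trigger could look like the following (the job name, runner label, and steps are assumptions, not the repository's actual configuration):

```yaml
# Hypothetical sketch of linux-x64-gpu.yml -- only the pull_request trigger
# is known from this run page; jobs and steps below are placeholders.
name: linux-x64-gpu

on:
  pull_request:

jobs:
  build:
    # Assumed runner label; the real workflow may use a self-hosted GPU runner.
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Build and test (placeholder)
        run: echo "Build and GPU test steps would run here"
```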