[Fix] Support actual seqlen in flash-attention2 #82

Triggered via pull request on September 18, 2023, 10:54
Status: Cancelled
Total duration: 4m 2s

linux-x64-gpu.yml

on: pull_request

Annotations

2 errors
cuda-118: Canceling since a higher priority waiting request for 'linux-x64-gpu-refs/pull/418/merge' exists
cuda-118: The operation was canceled.
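
Both annotations reflect GitHub Actions' concurrency handling: when a newer run is queued for the same concurrency group (here 'linux-x64-gpu-refs/pull/418/merge'), any in-progress run for that group is canceled and its jobs report "The operation was canceled." A minimal sketch of a workflow that behaves this way follows; the group expression, runner label, and steps are assumptions for illustration, not the actual contents of linux-x64-gpu.yml.

name: linux-x64-gpu

on: pull_request

# Cancel the in-progress run when a newer run is queued for the same ref.
# The group key mirrors the 'linux-x64-gpu-<ref>' pattern seen in the
# cancellation message above; the exact expression is an assumption.
concurrency:
  group: linux-x64-gpu-${{ github.ref }}
  cancel-in-progress: true

jobs:
  cuda-118:
    runs-on: ubuntu-latest  # assumed runner; the real job presumably targets a CUDA 11.8 host
    steps:
      - uses: actions/checkout@v3
      # Placeholder for the actual build/test steps, which are not shown on this run page.
      - name: Build with CUDA 11.8
        run: echo "build steps elided"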