
[Fix] Support actual seqlen in flash-attention2 #81

Triggered via: pull request, September 18, 2023 10:30
Status: Cancelled
Total duration: 24m 44s

linux-x64-gpu.yml

on: pull_request

Annotations

2 errors

cuda-118: Canceling since a higher priority waiting request for 'linux-x64-gpu-refs/pull/418/merge' exists
cuda-118: The operation was canceled.
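
The two errors above are GitHub Actions' standard concurrency cancellation: a newer run for the same pull request claimed the concurrency group 'linux-x64-gpu-refs/pull/418/merge', so this older run was canceled. Below is a minimal sketch of a workflow concurrency configuration that produces exactly this behavior; the group expression, runner, and job body are assumptions inferred from the group name and the cuda-118 job name, not the actual contents of linux-x64-gpu.yml.

```yaml
# Hypothetical sketch, not the actual linux-x64-gpu.yml.
name: linux-x64-gpu

on: pull_request

# With the group keyed on the workflow name and ref, a new push to the PR
# starts a fresh run and GitHub cancels the older one with the message:
# "Canceling since a higher priority waiting request for
#  'linux-x64-gpu-refs/pull/<N>/merge' exists"
concurrency:
  group: linux-x64-gpu-${{ github.ref }}
  cancel-in-progress: true

jobs:
  cuda-118:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # Build/test steps for the CUDA 11.8 target would go here.
      - name: Build
        run: echo "placeholder build step"
```

For a pull request, `github.ref` expands to `refs/pull/<N>/merge`, which matches the group name shown in the error; `cancel-in-progress: true` is what allows a run that had already been executing for ~24 minutes to be canceled rather than merely dequeued.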