Update Flash-Attention to v2.5.6 (fairinternal/xformers#1044) #5
Triggered via push: March 5, 2024 07:19
Status: Failure
Total duration: 1d 12h 55m 11s
conda.yml

on: push
Matrix: build

Annotations

4 errors
py3.10-torch2.2.0-cu11.8.0
This request was automatically failed because there were no enabled runners online to process the request for more than 1 day.
py3.9-torch2.2.0-cu11.8.0
This request was automatically failed because there were no enabled runners online to process the request for more than 1 day.
py3.10-torch2.2.0-cu12.1.0
This request was automatically failed because there were no enabled runners online to process the request for more than 1 day.
py3.9-torch2.2.0-cu12.1.0
This request was automatically failed because there were no enabled runners online to process the request for more than 1 day.
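The four failed jobs correspond to a build matrix over Python, PyTorch, and CUDA versions, and the error message suggests the workflow targets self-hosted runners that were offline. A minimal sketch of a GitHub Actions matrix that would produce job names like these (the keys, versions, and runner labels here are assumptions for illustration, not the actual contents of conda.yml):

```yaml
# Hypothetical sketch, not the repository's real conda.yml.
name: conda
on: push
jobs:
  build:
    # Yields job names such as py3.10-torch2.2.0-cu11.8.0
    name: py${{ matrix.python }}-torch${{ matrix.pytorch }}-cu${{ matrix.cuda }}
    # Assumed self-hosted GPU runners; if none are online, GitHub fails the
    # request automatically after roughly a day, as seen above.
    runs-on: [self-hosted, gpu]
    strategy:
      matrix:
        python: ["3.9", "3.10"]
        pytorch: ["2.2.0"]
        cuda: ["11.8.0", "12.1.0"]
    steps:
      - uses: actions/checkout@v4
      # build steps omitted
```

With this matrix shape, 2 Python versions x 1 PyTorch version x 2 CUDA versions gives exactly the 4 jobs listed in the annotations.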