
🐛 [Bug] Softmax in nn.MultiheadAttention layer not fused with torch.compile backend #2127
Triggered via issue comment on August 25, 2023 at 21:19
@gs-olive commented on #2267 (commit a65c95c)
Status: Skipped
Total duration: 5s
Artifacts: none

blossom-ci.yml

on: issue_comment
Jobs:
Authorization (0s)
Upload log (0s)
Vulnerability scan (0s)
Start ci job (0s)
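
For context, a minimal sketch of how an issue_comment-gated workflow like this one can skip its jobs when a comment does not match a trigger phrase. The actual blossom-ci.yml contents are not shown on this page, and the '/blossom-ci' trigger phrase is an assumption for illustration:

```yaml
# Hypothetical sketch of an issue_comment-triggered workflow; the real
# blossom-ci.yml contents are not shown here, and the trigger phrase
# '/blossom-ci' is an assumption.
name: Blossom-CI
on:
  issue_comment:
    types: [created]

jobs:
  Authorization:
    # When the comment body does not match the trigger phrase, the job is
    # skipped, which would explain this run's Skipped status and ~5s duration.
    if: ${{ contains(github.event.comment.body, '/blossom-ci') }}
    runs-on: ubuntu-latest
    steps:
      - name: Check commenter permissions
        run: echo "authorizing ${{ github.event.comment.user.login }}"
```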