feat(pytorch): Allow TransformerLayer and MultiheadAttention to accept sequence length parameters #3657

Triggered via issue comment on August 16, 2024 at 20:46
@ptrendx commented on #1066 (commit 3040785)
Status: Skipped
Total duration: 4s
Artifacts

Workflow: blossom-ci.yml
Trigger: issue_comment
Jobs:
- Authorization (0s)
- Upload log (0s)
- Vulnerability scan (0s)
- Start ci job (0s)