@MiladInk We haven't added support yet but it shouldn't be too hard. I've also been meaning to add vmap support for scaled_dot_product_attention... might do it this weekend.
I know that `scaled_dot_product_attention` from `torch.nn.functional` does not support `vmap`, and it is a big problem for me. Does FlexAttention support batched mode?