Commit 16d56c4
* split_head_dim flax attn

* Make split_head_dim non default

* make style and make quality

* add description for split_head_dim flag

* Update src/diffusers/models/attention_flax.py

Co-authored-by: Patrick von Platen <[email protected]>

---------

Co-authored-by: Juan Acevedo <[email protected]>
Co-authored-by: Patrick von Platen <[email protected]>
1 parent: c82f7ba. Showing 1 changed file with 25 additions and 7 deletions.
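
For context on what the commit message describes, here is a minimal sketch of the two head layouts involved. This is not the diffusers implementation: the function names and einsum spellings are illustrative, and the sketch only assumes standard multi-head attention math.

```python
import jax
import jax.numpy as jnp

def attention_fold_heads_into_batch(q, k, v, heads):
    """Baseline layout: fold heads into the batch axis,
    (batch, seq, heads * dim_head) -> (batch * heads, seq, dim_head)."""
    b, t, d = q.shape
    dim_head = d // heads

    def to_batch(x):
        x = x.reshape(b, t, heads, dim_head)
        return jnp.transpose(x, (0, 2, 1, 3)).reshape(b * heads, t, dim_head)

    q, k, v = map(to_batch, (q, k, v))
    scores = jnp.einsum("b i d, b j d -> b i j", q, k) / jnp.sqrt(dim_head)
    out = jnp.einsum("b i j, b j d -> b i d", jax.nn.softmax(scores, axis=-1), v)
    # Undo the fold (the role of reshape_batch_dim_to_heads):
    # (batch * heads, seq, dim_head) -> (batch, seq, heads * dim_head).
    out = out.reshape(b, heads, t, dim_head)
    return jnp.transpose(out, (0, 2, 1, 3)).reshape(b, t, d)

def attention_split_head_dim(q, k, v, heads):
    """split_head_dim layout: keep the batch axis intact and carry heads
    as their own axis, (batch, seq, heads, dim_head), so no fold/unfold
    of the batch dimension is needed."""
    b, t, d = q.shape
    dim_head = d // heads
    q, k, v = (x.reshape(b, t, heads, dim_head) for x in (q, k, v))
    scores = jnp.einsum("b i h d, b j h d -> b h i j", q, k) / jnp.sqrt(dim_head)
    out = jnp.einsum("b h i j, b j h d -> b i h d",
                     jax.nn.softmax(scores, axis=-1), v)
    return out.reshape(b, t, d)
```

Both functions compute the same attention; the split_head_dim variant simply avoids reshaping the batch dimension, which is the layout change the new flag opts into.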
Review comment on 16d56c4:
Hey there -- I think this change may introduce a bug: removing the call to self.reshape_batch_dim_to_heads on line 216 means it will no longer be invoked when self.use_memory_efficient_attention is True and self.split_head_dim is False.
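
To make the failure mode concrete, here is a hedged sketch of the branch structure the reviewer is pointing at. The flag and method names are taken from the comment above; the function signature and bodies are simplified stand-ins, not the actual code in attention_flax.py.

```python
import jax.numpy as jnp

def finalize_attention_output(hidden_states, heads, dim_head, split_head_dim,
                              reshape_batch_dim_to_heads):
    """Return attention output as (batch, seq, heads * dim_head),
    regardless of which attention path produced it."""
    if split_head_dim:
        # split_head_dim path: heads were never folded into the batch
        # axis, so a plain reshape is enough.
        batch = hidden_states.shape[0]
        return jnp.reshape(hidden_states, (batch, -1, heads * dim_head))
    # Every other path -- memory-efficient or vanilla -- folded heads into
    # the batch axis, so reshape_batch_dim_to_heads must still run here.
    # The reported bug is that the memory-efficient branch lost this call.
    return reshape_batch_dim_to_heads(hidden_states)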