[Bugfix] Fix fully sharded LoRAs with Mixtral
- Changes ReplicatedLinearWithLoRA to always apply regardless of the fully sharded LoRA setting, since in both cases the layer needs to be replicated
- Updates the existing Mixtral all-modules test to cover both values of fully_sharded_loras (which includes a ReplicatedLayer [gate])

Signed-off-by: Jason Greene <[email protected]>
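The first change above can be sketched as follows. This is a simplified, hypothetical reconstruction, not the actual diff: the class names mirror vLLM's LoRA layer conventions, but `LoRAConfig` and the stand-in `ReplicatedLinear` here are minimal stubs. The point is that the replacement check for a replicated layer no longer consults the fully-sharded flag, since a replicated layer has no sharding either way.

```python
from dataclasses import dataclass


class ReplicatedLinear:
    """Stand-in for a replicated (unsharded) linear layer, e.g. Mixtral's gate."""


@dataclass
class LoRAConfig:
    fully_sharded_loras: bool = False


class ReplicatedLinearWithLoRA:
    @classmethod
    def can_replace_layer(cls, source_layer, lora_config):
        # A replicated layer carries no sharding, so it needs the same LoRA
        # wrapper whether or not fully_sharded_loras is enabled; the fix is
        # to ignore that flag here rather than bail out when it is set.
        return isinstance(source_layer, ReplicatedLinear)


# The wrapper now applies for both values of fully_sharded_loras:
for fully_sharded in (False, True):
    cfg = LoRAConfig(fully_sharded_loras=fully_sharded)
    assert ReplicatedLinearWithLoRA.can_replace_layer(ReplicatedLinear(), cfg)
```

Before the fix, the equivalent check returned `False` when fully sharded LoRAs were enabled, leaving Mixtral's gate layer without a LoRA wrapper in that mode.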
2 changed files with 5 additions and 2 deletions.