Fix NoneType attribute error when loading multiple Flux loras (#10182)
jonathanyin12 authored Dec 11, 2024
1 parent 43534a8 commit 0967593
Showing 1 changed file with 1 addition and 1 deletion.
src/diffusers/loaders/lora_pipeline.py (1 addition, 1 deletion)

@@ -2313,7 +2313,7 @@ def _maybe_expand_transformer_param_shape_or_error_(
     for name, module in transformer.named_modules():
         if isinstance(module, torch.nn.Linear):
             module_weight = module.weight.data
-            module_bias = module.bias.data if hasattr(module, "bias") else None
+            module_bias = module.bias.data if module.bias is not None else None
             bias = module_bias is not None

             lora_A_weight_name = f"{name}.lora_A.weight"
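The old guard never worked as intended: `torch.nn.Linear` always defines a `bias` attribute, and when the layer is built with `bias=False` that attribute is registered as `None`. So `hasattr(module, "bias")` is always `True`, and `module.bias.data` raises `AttributeError: 'NoneType' object has no attribute 'data'`. A minimal sketch of the failure mode and the fix, using a stand-in class so torch is not required (`FakeLinear` is illustrative, not part of diffusers):

```python
class FakeLinear:
    """Stand-in for torch.nn.Linear: the `bias` attribute always
    exists, but holds None when the layer is built with bias=False."""
    def __init__(self, use_bias):
        self.weight = [1.0, 2.0]
        self.bias = [0.0] if use_bias else None

module = FakeLinear(use_bias=False)

# Old guard: hasattr() is True even though bias is None, so an
# attribute access like `module.bias.data` would raise AttributeError.
assert hasattr(module, "bias")

# Fixed guard: test the attribute's value, not its existence.
module_bias = module.bias if module.bias is not None else None
assert module_bias is None
```

The same pattern applies to any module whose optional parameters are registered as `None` rather than left undefined: check `is not None`, not `hasattr`.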
