Fix NoneType attribute error when loading multiple Flux LoRAs #10182
Conversation
- module_bias = module.bias.data if hasattr(module, "bias") else None
+ module_bias = module.bias.data if module.bias is not None else None
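For context: nn.Linear always exposes a bias attribute (it is registered as None when constructed with bias=False), so the old hasattr guard never filters anything and .data can end up being read from None. A minimal sketch, assuming only torch is installed:

import torch.nn as nn

layer = nn.Linear(4, 4, bias=False)
print(hasattr(layer, "bias"))    # True: nn.Linear registers the attribute even when bias=False
print(layer.bias)                # None
# layer.bias.data                # would raise AttributeError: 'NoneType' object has no attribute 'data'
print(layer.bias is not None)    # False: the fixed check correctly leaves module_bias as None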
Let's make this more explicit:
module_bias = None
if getattr(module, "bias", None) is not None:
module_bias = module.bias.data
WDYT? Additionally, do you think we should add a test for this? When we merged the Control LoRA PR, we did run all the integration tests too and we didn't face an issue.
Since this is under an if statement where we know the following layer is nn.Linear, we know that there is already a module.bias parameter. So I think the current condition is okay.
Yes, the integration tests did not fail when I last tested either. I think we can investigate again.
We should add one more fast test here IMO that tests for a failure when:
- First lora loaded expands the shape
- Second lora is of normal shape without the expansion
This use case is not supported yet, so until it is, we expect an error to be raised.
Another fast test we should probably have is loading a LoRA into nn.Linear with/without bias present, for a total of 4 tests (LoRA with/without bias, linear with/without bias - some are already covered by the existing test suite); a rough sketch of both is below.
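Rough pytest sketch of the two fast tests described above. The fixtures (flux_pipeline, expanding_lora, normal_lora, make_lora) are placeholders, not actual diffusers test utilities:

import pytest


def test_second_lora_without_expansion_raises(flux_pipeline, expanding_lora, normal_lora):
    # The first LoRA (Control-LoRA style) expands the shape of some transformer layers.
    flux_pipeline.load_lora_weights(expanding_lora, adapter_name="control")
    # A second LoRA with the original, unexpanded shapes is not supported yet,
    # so loading it should raise an explicit error rather than fail silently.
    with pytest.raises(Exception):
        flux_pipeline.load_lora_weights(normal_lora, adapter_name="style")


@pytest.mark.parametrize("linear_has_bias", [True, False])
@pytest.mark.parametrize("lora_has_bias", [True, False])
def test_lora_into_linear_bias_combinations(flux_pipeline, make_lora, linear_has_bias, lora_has_bias):
    # Covers the four bias combinations; some of them are already exercised by
    # the existing test suite, so only the missing ones would need new assertions.
    lora_state_dict = make_lora(lora_bias=lora_has_bias, target_bias=linear_has_bias)
    flux_pipeline.load_lora_weights(lora_state_dict, adapter_name="test")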
WDYT @sayakpaul? We can do this in a separate PR since the change here is minimal and is the actual correct thing to do.
Works for me, but let's treat the tests as high priority then (immediately after this merge) given how important LoRAs are.
Sounds good. I'll open a PR with the mentioned tests after merging this.
Thanks so much! I have a single comment.
Thank you for spotting this and fixing! 🤗
Okay for me to merge as is.
What does this PR do?
Fixes #10180 to allow loading multiple loras (issue originally introduced by #9999)
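A minimal reproduction sketch of the failure mode; the specific LoRA checkpoints below are illustrative placeholders, not necessarily the ones from the linked issue:

import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained("black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16)
# First LoRA loads fine (e.g. a Control LoRA that expands some layer shapes).
pipe.load_lora_weights("black-forest-labs/FLUX.1-Canny-dev-lora", adapter_name="canny")
# Before this fix, loading a second LoRA could fail with:
#   AttributeError: 'NoneType' object has no attribute 'data'
pipe.load_lora_weights("some-org/another-flux-lora", adapter_name="style")  # placeholder repo id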
Before submitting
See the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@a-r-r-o-w