
Fix NoneType attribute error when loading multiple Flux LoRAs #10182

Merged 1 commit into huggingface:main on Dec 11, 2024

Conversation

@jonathanyin12 (Contributor) commented Dec 11, 2024

What does this PR do?

Fixes #10180 to allow loading multiple LoRAs (issue originally introduced by #9999).
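
For context, a hedged sketch of the scenario this fixes; the model and LoRA repository ids below are illustrative assumptions rather than something taken from this PR:

import torch
from diffusers import FluxControlPipeline

pipe = FluxControlPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)

# First LoRA: a Flux Control LoRA, which expands the input shape of some linear layers.
pipe.load_lora_weights("black-forest-labs/FLUX.1-Canny-dev-lora", adapter_name="canny")

# Second LoRA: before this fix, loading any additional LoRA raised
# AttributeError: 'NoneType' object has no attribute 'data'.
pipe.load_lora_weights("path/to/another/flux-lora", adapter_name="style")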

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@a-r-r-o-w

-    module_bias = module.bias.data if hasattr(module, "bias") else None
+    module_bias = module.bias.data if module.bias is not None else None
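
For context (not part of the PR diff), a minimal sketch of why the hasattr check is not sufficient here:

import torch.nn as nn

# nn.Linear always registers a `bias` attribute, even when built with bias=False,
# so hasattr(module, "bias") is True while module.bias itself is None.
linear = nn.Linear(4, 8, bias=False)
assert hasattr(linear, "bias")  # True
assert linear.bias is None      # the bias parameter does not exist

# Old check: reads None.data and raises
# AttributeError: 'NoneType' object has no attribute 'data'.
# New check: tests the value instead of the attribute's existence.
module_bias = linear.bias.data if linear.bias is not None else None
assert module_bias is None
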
Member


Let's make this more explicit:

module_bias = None
if getattr(module, "bias", None) is not None:
    module_bias = module.bias.data

WDYT? Additionally, do you think we should add a test for this? When we merged the Control LoRA PR, we did run all the integration tests too and we didn't face an issue.

Member


Since this is under an if statement where we know the following layer is nn.Linear, we know that a module.bias attribute already exists. So I think the current condition is okay.

Yes, the integration tests did not fail when I last tested either. I think we can investigate again.

IMO, we should add one more fast test here that checks for a failure when:

  • the first LoRA loaded expands the shape, and
  • the second LoRA has the normal shape, without the expansion.

This use case is not supported yet, so until it is, we expect an error to be raised.

Another fast test we should probably have is loading a LoRA into an nn.Linear with/without a bias present, for a total of 4 tests (LoRA with/without bias, linear with/without bias; some are already covered by the existing test suite).

WDYT @sayakpaul? We can do this in a separate PR, since the change here is minimal and is the correct thing to do.
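
As a rough illustration of the bias test described above, a hypothetical pytest sketch; the real tests would live in the diffusers LoRA test suite and use its loader utilities, so merge_lora below is only a stand-in helper:

import pytest
import torch
import torch.nn as nn

def merge_lora(module, lora_A, lora_B):
    # Stand-in for the real merge path: fold the LoRA delta (B @ A) into the weight,
    # using the value-based bias check from this PR.
    module_bias = module.bias.data if module.bias is not None else None
    merged = nn.Linear(module.in_features, module.out_features, bias=module_bias is not None)
    merged.weight.data = module.weight.data + lora_B @ lora_A
    if module_bias is not None:
        merged.bias.data = module_bias
    return merged

@pytest.mark.parametrize("linear_bias", [True, False])
def test_merge_lora_into_linear_with_and_without_bias(linear_bias):
    linear = nn.Linear(4, 8, bias=linear_bias)
    lora_A, lora_B = torch.randn(2, 4), torch.randn(8, 2)
    merged = merge_lora(linear, lora_A, lora_B)
    assert merged.weight.shape == linear.weight.shape
    assert (merged.bias is not None) == linear_bias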

Member


Works for me, but let's then treat the tests as high priority (immediately after this merge), given how important LoRAs are.

Member


Sounds good. I'll open a PR with the mentioned tests after this is merged.

Member

@sayakpaul sayakpaul left a comment


Thanks so much! I have a single comment.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

Member

@a-r-r-o-w a-r-r-o-w left a comment


Thank you for spotting this and fixing! 🤗

@sayakpaul
Member

Okay for me to merge as is.

@a-r-r-o-w a-r-r-o-w merged commit 0967593 into huggingface:main Dec 11, 2024
15 checks passed
Development

Successfully merging this pull request may close these issues.

Can't load multiple loras when using Flux Control LoRA