
flux fill cannot use lora (flux turbo lora) #10184

Closed
@Suprhimp

Description


Describe the bug

I want to use the Flux Fill pipeline with the Turbo LoRA, but after loading the pipeline and then loading the LoRA weights, I get the error below.

Reproduction

import torch
from diffusers import FluxFillPipeline


def model_fn(model_dir: str) -> FluxFillPipeline:
    # Load the Fill pipeline, then attach and fuse the Turbo LoRA.
    pipe = FluxFillPipeline.from_pretrained(
        "black-forest-labs/FLUX.1-Fill-dev", torch_dtype=torch.bfloat16
    ).to("cuda")

    pipe.load_lora_weights("alimama-creative/FLUX.1-Turbo-Alpha")
    pipe.fuse_lora()

    return pipe

Logs

NotImplementedError: Only LoRAs with input/output features higher than the current module's input/output features are currently supported. The provided LoRA contains in_features=64 and out_features=3072, which are lower than module_in_features=384 and module_out_features=3072. If you require support for this please open an issue at https://github.com/huggingface/diffusers/issues.
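
The shapes in the message point at the transformer's input projection: FLUX.1-Fill-dev packs extra inpainting conditioning channels alongside the image latents, so its input projection expects 384 features, while the Turbo-Alpha LoRA was trained against FLUX.1-dev, whose input projection takes 64. A minimal sketch to confirm the module shapes, assuming the transformer exposes its input projection as x_embedder (as diffusers' FluxTransformer2DModel does):

import torch
from diffusers import FluxFillPipeline

pipe = FluxFillPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-Fill-dev", torch_dtype=torch.bfloat16
)

# The Fill transformer's input projection is wider than FLUX.1-dev's,
# which is the mismatch the NotImplementedError above reports.
print(pipe.transformer.x_embedder.in_features)   # 384
print(pipe.transformer.x_embedder.out_features)  # 3072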

System Info

diffusers installed from the latest GitHub source, Python 3.10, Ubuntu with an NVIDIA GPU

Who can help?

@sayakpaul
