
Commit 1c9fc0a

load t5 model in the same format as it is saved, seems to load as float32 on Macs
Vargol authored and psychedelicious committed Oct 22, 2024
1 parent d85733f commit 1c9fc0a
Showing 1 changed file with 1 addition and 1 deletion.
invokeai/backend/model_manager/load/model_loaders/flux.py: 1 addition & 1 deletion
@@ -175,7 +175,7 @@ def _load_model(
             case SubModelType.Tokenizer2:
                 return T5Tokenizer.from_pretrained(Path(config.path) / "tokenizer_2", max_length=512)
             case SubModelType.TextEncoder2:
-                return T5EncoderModel.from_pretrained(Path(config.path) / "text_encoder_2")
+                return T5EncoderModel.from_pretrained(Path(config.path) / "text_encoder_2", torch_dtype='auto')

         raise ValueError(
             f"Only Tokenizer and TextEncoder submodels are currently supported. Received: {submodel_type.value if submodel_type else 'None'}"
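The one-line change passes torch_dtype='auto' so transformers keeps whatever dtype the checkpoint was saved in instead of upcasting the weights to float32 at load time. A minimal sketch of the difference, assuming a hypothetical local directory models/flux/text_encoder_2 holding a saved T5 encoder (not InvokeAI's actual model path):

# Hedged sketch: illustrates the effect of torch_dtype="auto" in
# transformers' from_pretrained. The path below is a hypothetical example.
from pathlib import Path

from transformers import T5EncoderModel

model_dir = Path("models/flux") / "text_encoder_2"  # hypothetical local checkpoint

# Default: transformers instantiates the model in torch.float32, upcasting
# bfloat16/float16 weights and increasing memory use on load.
encoder_default = T5EncoderModel.from_pretrained(model_dir)
print(encoder_default.dtype)  # torch.float32

# With torch_dtype="auto", the dtype recorded in the checkpoint is kept,
# so a bfloat16-saved encoder loads as bfloat16.
encoder_as_saved = T5EncoderModel.from_pretrained(model_dir, torch_dtype="auto")
print(encoder_as_saved.dtype)  # e.g. torch.bfloat16

Per the commit message, loading the encoder in its saved format avoids the float32 upcast observed on Macs, roughly halving the memory needed for a checkpoint saved in a 16-bit dtype.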
