
Commit b76c987

[https://nvbugs/5467232][fix] Fix load_torch_hf_lora to override lora_config.trtllm_modules_to_hf_modules with default only when it has no value (#7168)
Signed-off-by: Wanli Jiang <[email protected]>
Parent: 01c5f2f

File tree

1 file changed, 2 insertions(+), 1 deletion(-)


tensorrt_llm/lora_manager.py

Lines changed: 2 additions & 1 deletion
@@ -456,7 +456,8 @@ def load_torch_hf_lora(lora_config: LoraConfig):
     pivot model config is the transformer's one.
     """
     # TODO smor- need to combine with load_hf_lora
-    lora_config.trtllm_modules_to_hf_modules = get_default_trtllm_modules_to_hf_modules()
+    if not lora_config.trtllm_modules_to_hf_modules:
+        lora_config.trtllm_modules_to_hf_modules = get_default_trtllm_modules_to_hf_modules()
 
     assert len(lora_config.lora_dir) == 1, "Expecting only a single lora dir"
     lora_loader = HfLoraLoader(lora_config.lora_dir)
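
The fix turns an unconditional assignment into a guarded fallback: previously load_torch_hf_lora always reset lora_config.trtllm_modules_to_hf_modules to the default mapping, discarding any mapping the caller had configured. Below is a minimal, self-contained Python sketch of the new behavior; LoraConfigSketch, apply_default_mapping, and the stub default mapping are hypothetical stand-ins for the real LoraConfig, get_default_trtllm_modules_to_hf_modules, and load_torch_hf_lora in tensorrt_llm/lora_manager.py.

    from dataclasses import dataclass, field
    from typing import Dict, List

    def get_default_trtllm_modules_to_hf_modules() -> Dict[str, str]:
        # Stand-in for the real helper in tensorrt_llm/lora_manager.py; the
        # actual default mapping is larger (attention and MLP modules).
        return {"attn_q": "q_proj", "attn_k": "k_proj", "attn_v": "v_proj"}

    @dataclass
    class LoraConfigSketch:
        lora_dir: List[str]
        trtllm_modules_to_hf_modules: Dict[str, str] = field(default_factory=dict)

    def apply_default_mapping(cfg: LoraConfigSketch) -> None:
        # The fixed behavior: fall back to the default mapping only when the
        # caller supplied none (None or an empty dict is falsy).
        if not cfg.trtllm_modules_to_hf_modules:
            cfg.trtllm_modules_to_hf_modules = get_default_trtllm_modules_to_hf_modules()

    # A caller-supplied mapping now survives loading instead of being clobbered:
    cfg = LoraConfigSketch(lora_dir=["./my_lora"],
                           trtllm_modules_to_hf_modules={"attn_q": "custom_q"})
    apply_default_mapping(cfg)
    assert cfg.trtllm_modules_to_hf_modules == {"attn_q": "custom_q"}

    # With no mapping set, the default is applied, matching the old behavior:
    cfg2 = LoraConfigSketch(lora_dir=["./my_lora"])
    apply_default_mapping(cfg2)
    assert cfg2.trtllm_modules_to_hf_modules == get_default_trtllm_modules_to_hf_modules()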
