Ignore potential junk tensors in LoRA
turboderp committed Jul 23, 2024
1 parent 81cd6b7 commit 46a803f
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions exllamav2/lora.py

@@ -81,6 +81,8 @@ def __init__(self,
         f = load_file(self.lora_path, map_location = "cpu")

         for key in f.keys():
+            if any(key.endswith(x) for x in [".original_module.weight", ".modules_to_save.weight"]):
+                continue
             tensor = f[key]

             # Find target
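The added lines skip PEFT bookkeeping tensors, whose keys end in ".original_module.weight" or ".modules_to_save.weight", so that only real LoRA tensors reach the target-module lookup that follows. A minimal standalone sketch of the same filter, using a plain dict in place of the loaded safetensors file (the key names and the helper below are illustrative, not part of exllamav2):

```python
# Suffixes of PEFT bookkeeping entries that are not LoRA A/B matrices.
JUNK_SUFFIXES = (".original_module.weight", ".modules_to_save.weight")

def filter_lora_keys(state_dict):
    """Return only the entries that should be mapped onto target modules."""
    return {
        key: tensor
        for key, tensor in state_dict.items()
        if not any(key.endswith(s) for s in JUNK_SUFFIXES)
    }

# Illustrative state dict mixing real LoRA tensors with junk entries.
tensors = {
    "model.layers.0.self_attn.q_proj.lora_A.weight": "A",
    "model.layers.0.self_attn.q_proj.lora_B.weight": "B",
    "base_model.lm_head.original_module.weight": "junk",
    "base_model.lm_head.modules_to_save.weight": "junk",
}
kept = filter_lora_keys(tensors)
print(sorted(kept))  # only the lora_A / lora_B keys remain
```

Matching on key suffixes rather than exact names keeps the check independent of the layer path, which is why the original patch uses `key.endswith(x)`.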
