Commit d0f27d2

change to kwargs
SunMarc committed Sep 29, 2023
1 parent: 498c4b8
Showing 1 changed file with 1 addition and 1 deletion.
optimum/bettertransformer/models/attention.py (2 changes: 1 addition & 1 deletion)
```diff
@@ -583,7 +583,7 @@ def llama_forward(
     past_key_value: Optional[Tuple[torch.Tensor]] = None,
     output_attentions: bool = False,
     use_cache: bool = False,
-    padding_mask: Optional[torch.LongTensor] = None,
+    **kwargs,
 ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Tuple[torch.Tensor]]]:
     if output_attentions is True:
         raise ValueError("output_attentions=True can not be supported with BetterTransformer.")
```
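The replaced parameter suggests the motivation: some transformers versions pass an extra `padding_mask` keyword into the attention forward, and a patched forward with a fixed parameter list raises `TypeError` as soon as the upstream call site adds a keyword it does not know about. Accepting `**kwargs` keeps the override tolerant of that kind of signature drift. A minimal sketch of the failure mode follows; the function names are illustrative, not from optimum:

```python
# Minimal sketch (illustrative names, not optimum code): a patched forward
# with a fixed signature breaks when the caller adds a new keyword, while
# a **kwargs signature absorbs it.

def strict_forward(hidden_states, use_cache=False):
    # Fixed signature: any new keyword added by the caller breaks it.
    return hidden_states

def tolerant_forward(hidden_states, use_cache=False, **kwargs):
    # Unused extras such as padding_mask land in kwargs and are ignored.
    return hidden_states

tolerant_forward("h", use_cache=True, padding_mask=None)      # works
try:
    strict_forward("h", use_cache=True, padding_mask=None)    # raises
except TypeError as err:
    print(err)  # ...got an unexpected keyword argument 'padding_mask'
```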
