I cloned the latest version of transformers in order to try the new Mistral 7B model. That model didn't work for me, so I went back to the model I was using before, which relies on transformers and BetterTransformer, but now my BetterTransformer code fails. Figuring it might be a compatibility issue, I tried upgrading optimum, and when that didn't work I cloned the latest version of optimum, but I'm still faced with the same issue.
```
Traceback (most recent call last):
  File "/usr/local/llamaengineer.py", line 553, in <module>
    generated_text = generate(prompt)
  File "/usr/local/llamaengineer.py", line 543, in generate
    generate_ids = bt_model.generate(inputs.input_ids.to("cuda"),do_sample=True, max_length=4096, temperature=0.7)
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 1652, in generate
    return self.sample(
  File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 2734, in sample
    outputs = self(
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/llama/modeling_llama.py", line 1038, in forward
    outputs = self.model(
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/llama/modeling_llama.py", line 925, in forward
    layer_outputs = decoder_layer(
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/llama/modeling_llama.py", line 635, in forward
    hidden_states, self_attn_weights, present_key_value = self.self_attn(
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/optimum/bettertransformer/models/decoder_models.py", line 426, in forward
    return llama_forward(self, *args, **kwargs)
TypeError: llama_forward() got an unexpected keyword argument 'padding_mask'
```
I'm not entirely sure what the problem is, because this script was working a couple of days ago. Any help would be greatly appreciated.
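For context, the relevant part of my script is roughly the following sketch (the model name, prompt handling, and variable names are simplified placeholders; the conversion is just the standard `BetterTransformer.transform` call from optimum):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from optimum.bettertransformer import BetterTransformer

# Placeholder: any Llama-architecture checkpoint, loaded across devices with accelerate
model_id = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # accelerate hooks, as seen in the traceback
)

# Convert the model to the BetterTransformer (SDPA) implementation
bt_model = BetterTransformer.transform(model)

def generate(prompt: str) -> str:
    inputs = tokenizer(prompt, return_tensors="pt")
    generate_ids = bt_model.generate(
        inputs.input_ids.to("cuda"),
        do_sample=True,
        max_length=4096,
        temperature=0.7,
    )
    return tokenizer.batch_decode(generate_ids, skip_special_tokens=True)[0]
```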
Hi, apologies for the breaking change. The issue is fixed in #1421; feel free to use the latest commit for now until we do a release! Alternatively, you should not face this issue with transformers 4.33.3.
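For reference, either option looks like this:

```
# install optimum from the current main branch (includes the fix)
pip install -U git+https://github.com/huggingface/optimum.git

# or keep your current optimum and pin transformers to the previous release
pip install transformers==4.33.3
```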