
Fix cache position issue in mixtral #2

Closed

tthakkal wants to merge 1 commit from mixtral_cache_issue

Conversation

tthakkal
Owner

Fixes the following crash:

```
File "/usr/local/lib/python3.10/dist-packages/optimum/habana/transformers/models/mixtral/modeling_mixtral.py", line 613, in forward
    past_seen_tokens = past_key_values.get_seq_length() if past_key_values is not None else 0
AttributeError: 'list' object has no attribute 'get_seq_length'
```
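The error occurs because legacy code paths pass `past_key_values` as a plain Python list of per-layer key/value tuples, which does not implement the newer `Cache` API and therefore has no `get_seq_length()` method. A minimal sketch of the kind of guard that avoids the crash (hypothetical helper name; this is not necessarily Optimum Habana's actual patch):

```python
def get_past_seen_tokens(past_key_values):
    """Return the cached sequence length, tolerating both new-style
    Cache objects and legacy lists of (key, value) tuples."""
    if past_key_values is None:
        return 0
    # New-style Cache objects expose get_seq_length().
    if hasattr(past_key_values, "get_seq_length"):
        return past_key_values.get_seq_length()
    # Legacy cache: a list of (key, value) tuples per layer, where the key
    # tensor has shape (batch, num_heads, seq_len, head_dim) -- the cached
    # sequence length is the second-to-last dimension.
    if len(past_key_values) == 0:
        return 0
    return past_key_values[0][0].shape[-2]
```

An alternative used elsewhere in `transformers` is to convert the legacy list up front with `DynamicCache.from_legacy_cache(past_key_values)` so the rest of `forward` only ever sees a `Cache` object.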

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

@tthakkal tthakkal closed this Aug 19, 2024
@tthakkal tthakkal deleted the mixtral_cache_issue branch August 21, 2024 01:53
@tthakkal tthakkal restored the mixtral_cache_issue branch August 21, 2024 01:53