Fix generation not concluding when eos was hit for each prompt
cornzz authored Aug 21, 2024
1 parent 337e3db commit f93fd8e
Showing 1 changed file with 1 addition and 1 deletion.
src/mistral_inference/generate.py (1 addition, 1 deletion)
@@ -112,7 +112,7 @@ def generate(
         next_token = sample(last_token_prelogits, temperature=temperature, top_p=0.8)

         if eos_id is not None:
-            is_finished = is_finished ^ (next_token == eos_id).cpu()
+            is_finished = is_finished | (next_token == eos_id).cpu()

         if is_finished.all():
             break
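
The old update used XOR, so a prompt that had already finished was flipped back to "unfinished" whenever it sampled the EOS token again, and is_finished.all() could stay False even after every prompt had hit EOS at some point. The following is a minimal sketch of that bookkeeping only (hypothetical token values, eos_id = 2), not the library's actual generate() loop, showing why the bitwise OR fixes it:

    # Minimal sketch of the is_finished bookkeeping (hypothetical token values,
    # eos_id = 2); this is not the actual generate() loop from mistral_inference.
    import torch

    eos_id = 2
    steps = [
        torch.tensor([2, 5]),  # step 1: prompt 0 hits EOS, prompt 1 does not
        torch.tensor([2, 7]),  # step 2: already-finished prompt 0 samples EOS again
        torch.tensor([4, 2]),  # step 3: prompt 1 hits EOS
    ]

    xor_finished = torch.tensor([False, False])  # old behaviour (^)
    or_finished = torch.tensor([False, False])   # fixed behaviour (|)
    for next_token in steps:
        xor_finished = xor_finished ^ (next_token == eos_id)
        or_finished = or_finished | (next_token == eos_id)

    print(xor_finished.all())  # tensor(False): step 2 toggled prompt 0 back to unfinished
    print(or_finished.all())   # tensor(True): every prompt has hit EOS, the loop can break

With |, the finished flag is monotone: once a prompt has emitted EOS it stays finished, so the is_finished.all() check eventually fires and generation concludes for the whole batch.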
