Fix side effect brought by supporting codellama: sequence_start is always true when calling `model.get_prompt` (#466)
lvhan028 authored Sep 25, 2023
1 parent 7194500 commit e980377
Showing 1 changed file with 1 addition and 1 deletion.
lmdeploy/turbomind/chat.py (1 addition, 1 deletion)
@@ -123,7 +123,7 @@ def main(model_path,
             step = 0
             seed = random.getrandbits(64)
         else:
-            prompt = model.get_prompt(prompt, nth_round)
+            prompt = model.get_prompt(prompt, nth_round == 1)
             input_ids = tokenizer.encode(prompt)
             if step + len(input_ids) >= tm_model.session_len:
                 print('WARNING: exceed session max length.'
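The second argument of `model.get_prompt` is the `sequence_start` flag. Since `nth_round` starts at 1 and only grows, passing it directly makes the flag truthy on every round; comparing it against 1 makes the flag true only on the first round. A minimal sketch of the difference, where the `get_prompt` stub is hypothetical and only stands in for the real model template logic:

# Hypothetical stand-in for model.get_prompt: `sequence_start` decides
# whether the session-opening decorations (system prompt, BOS) are added.
def get_prompt(prompt: str, sequence_start: bool = True) -> str:
    if sequence_start:
        return f'<BOS><system>\n{prompt}'  # first round: full template
    return prompt  # follow-up rounds: raw user text only

for nth_round in (1, 2, 3):
    old = get_prompt('hi', nth_round)        # old call: any non-zero round is truthy
    new = get_prompt('hi', nth_round == 1)   # fixed call: true only on round 1
    print(nth_round, bool(nth_round), nth_round == 1)

Running the sketch shows the old call evaluating `sequence_start` as True for every round, while the fixed call yields True, False, False.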
