Problem:
When loading a model whose chat template does not contain an `add_generation_prompt` block, Unsloth raises a runtime error rather than just a warning. This means that even if one does not want a generation prompt, one is forced to add it.
Specific error:
Unsloth: The tokenizer `Llama-3.2-1B-lora-model`
has a {% if add_generation_prompt %} for generation purposes, but wasn't provided correctly.
Recommendations:
Make this a warning, not a runtime error that stops loading.
Provide an example of the exact syntax that Unsloth expects (see the sketch after this list).
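For reference, here is a minimal sketch of what a Llama-3-style chat template containing an `{% if add_generation_prompt %}` block typically looks like. The special tokens, template layout, and the local `Llama-3.2-1B-lora-model` path are illustrative assumptions based on common Llama 3 templates, not Unsloth's confirmed requirement.

```python
from transformers import AutoTokenizer

# Hypothetical local path taken from the error message; substitute your own model.
tokenizer = AutoTokenizer.from_pretrained("Llama-3.2-1B-lora-model")

# Llama-3-style chat template; the special tokens and layout are assumptions,
# not taken from Unsloth's source.
llama3_style_template = (
    "{% for message in messages %}"
    "{{ '<|start_header_id|>' + message['role'] + '<|end_header_id|>\\n\\n' }}"
    "{{ message['content'] | trim }}{{ '<|eot_id|>' }}"
    "{% endfor %}"
    # This is the block the error message refers to; without it, loading
    # currently fails instead of warning.
    "{% if add_generation_prompt %}"
    "{{ '<|start_header_id|>assistant<|end_header_id|>\\n\\n' }}"
    "{% endif %}"
)
tokenizer.chat_template = llama3_style_template

messages = [{"role": "user", "content": "Hello!"}]
# With the block present, both calls work; the flag only toggles whether the
# trailing assistant header is appended.
print(tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
print(tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=False))
```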