Some of your uploaded Hugging Face models lack the `rope_scaling` parameter in the config. Without `rope_scaling`, the model just generates `" " " " " "`.
."rope_scaling": {"factor": 2.0, "type": "linear"}
in Llama-2-7b-longlora-8k-ft"rope_scaling": {"factor": 4.0, "type": "linear"}
in Llama-2-7b-longlora-16k-ft"rope_scaling": {"factor": 8.0, "type": "linear"}
in Llama-2-7b-longlora-32k-ft"rope_scaling": {"factor": 25.0, "type": "linear"}
in Llama-2-7b-longlora-100k-ft"rope_scaling": {"factor": 2.0, "type": "linear"}
in Llama-2-13b-longlora-8k-ft"rope_scaling": {"factor": 4.0, "type": "linear"}
in Llama-2-13b-longlora-16k-ft"rope_scaling": {"factor": 2.0, "type": "linear"}
in Llama-2-7b-longlora-8k"rope_scaling": {"factor": 4.0, "type": "linear"}
in Llama-2-7b-longlora-16k"rope_scaling": {"factor": 8.0, "type": "linear"}
in Llama-2-7b-longlora-32k"rope_scaling": {"factor": 2.0, "type": "linear"}
in Llama-2-13b-longlora-8k"rope_scaling": {"factor": 4.0, "type": "linear"}
in Llama-2-13b-longlora-16k"rope_scaling": {"factor": 8.0 ,"type": "linear"}
in Llama-2-13b-longlora-32k"rope_scaling": {"factor": 16.0, "type": "linear"}
in Llama-2-13b-longlora-64k"rope_scaling": {"factor": 8.0, "type": "linear"}
in Llama-2-70b-longlora-32k"rope_scaling": {"factor": 8.0, "type": "linear"}
in Llama-2-70b-chat-longlora-32kThe text was updated successfully, but these errors were encountered:
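Until the uploaded configs are fixed, one workaround is to patch `rope_scaling` at load time instead of editing `config.json` on the Hub. Below is a minimal sketch using Hugging Face `transformers` (Llama linear rope scaling needs v4.31+); the `Yukang/Llama-2-7b-longlora-32k-ft` repo id and the factor `8.0` are just examples taken from the list above, so substitute the model and factor you actually use.

```python
# Minimal sketch: patch the missing rope_scaling before loading.
# Assumes transformers >= 4.31 (Llama rope_scaling support); the repo id
# below is only an example drawn from the list above.
from transformers import AutoConfig, AutoModelForCausalLM

model_id = "Yukang/Llama-2-7b-longlora-32k-ft"  # replace with the model you use

# Load the config and add the rope_scaling entry that the uploaded
# config.json is missing (factor 8.0 for the 32k variants, per the list above).
config = AutoConfig.from_pretrained(model_id)
config.rope_scaling = {"type": "linear", "factor": 8.0}

# Load the model with the patched config so linear position interpolation is applied.
model = AutoModelForCausalLM.from_pretrained(model_id, config=config)
```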