llama-factory llama2-13b and mistral-7b pipeline (#93)
* added llama-factory under llm_rl
* added sft training bash
* added datasets from llama-factory; will delete later
* finished llama-2-13b train and inference
* fixed minor errors
* changed config
* added deepspeed config
* added more training config to train bash
* added fix for wandb tags and distributed ranks
* added fastchat data to replicate training for 2k
* trying to replicate fastchat as closely as possible
* before merging
* changed finetune scripts for better performance
* added new data
* example bash
* example bash for mistral
1 parent eb68a69 · commit 07ee480
Showing 11 changed files with 52 additions and 27 deletions.
@@ -1,46 +1,47 @@
 deepspeed src/train_bash.py \
     --stage sft \
     --model_name_or_path meta-llama/Llama-2-13b-hf \
-    --dataset sotopia_easy_sft \
+    --dataset fastchat-sft \
     --dataset_dir ./data/ \
     --val_size 0.1 \
     --cutoff_len 4096 \
     --template llama2-sotopia \
     --wandb_project "llama-factory-sft" \
     --wandb_tags "['llama-2-13b-hf']" \
     --use_fast_tokenizer False \
     --do_train \
     --num_train_epochs 15.0 \
-    --per_device_train_batch_size 8 \
-    --gradient_accumulation_steps 8 \
+    --per_device_train_batch_size 1 \
+    --gradient_accumulation_steps 32 \
     --finetuning_type lora \
     --lora_target q_proj,v_proj \
     --lora_rank 8 \
     --lora_alpha 16 \
     --lora_dropout 0.05 \
     --qlora_compute_dtype bf16 \
     --learning_rate 5e-5 \
     --lr_scheduler_type cosine \
     --weight_decay 0. \
     --warmup_ratio 0.03 \
     --quantization_bit 4 \
     --quantization_type nf4 \
-    --double_quantization \
+    --double_quantization True \
     --flash_attn True \
     --gradient_checkpointing True \
-    --bf16 \
+    --bf16 True \
     --cache_dir ./model_cache \
     --overwrite_cache \
     --output_dir ./llama2-13b-sft_cache \
     --overwrite_output_dir \
     --logging_steps 1 \
     --evaluation_strategy "steps" \
     --per_device_eval_batch_size 32 \
     --eval_accumulation_steps 32 \
     --save_strategy "epoch" \
     --save_total_limit 5 \
     --use_auth_token True \
     --wandb_token "99caa13ec9552adf0e92e5c30021307ce3cf7fa4" \
     --hf_auth_token "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" \
     --deepspeed ./deepspeed_config_s2.json
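Two things are worth noting in the revised flags. First, the batch-size change halves the effective per-device batch from 8 × 8 = 64 to 1 × 32 = 32, trading some throughput for a much smaller activation-memory footprint. Second, the quantization and LoRA flags correspond, roughly, to the standard QLoRA setup in the underlying libraries. The sketch below shows that mapping expressed directly against transformers and peft; it illustrates the configuration and is not LLaMA-Factory's actual loading code.

# Rough equivalent of the quantization + LoRA flags above, written against
# transformers/peft directly. Model name and cache dir come from the command
# line; everything else mirrors the flags one-for-one.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# --quantization_bit 4 / --quantization_type nf4 /
# --double_quantization True / --qlora_compute_dtype bf16
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-13b-hf",
    quantization_config=bnb_config,
    cache_dir="./model_cache",
)

# --finetuning_type lora / --lora_target q_proj,v_proj /
# --lora_rank 8 / --lora_alpha 16 / --lora_dropout 0.05
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the q/v LoRA adapters are trainable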
# --dataset alpaca_gpt4_en \
# --val_size 0.1 \
# --evaluation_strategy "steps" \
# --per_device_eval_batch_size 32 \
# --eval_accumulation_steps 32 \
# --lora_rank 8 \
# --lora_alpha 16 \
# --lora_dropout 0.05 \
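The --deepspeed flag points at ./deepspeed_config_s2.json, which is not included in this diff. Going by the _s2 suffix and the bf16/gradient-accumulation flags above, it is presumably a ZeRO stage-2 config; the sketch below is a guess at a minimal file in that style (written as a Python dict for illustration), not the repository's actual config.

# Hypothetical ZeRO stage-2 DeepSpeed config in the spirit of
# deepspeed_config_s2.json; "auto" defers each value to the matching
# HuggingFace Trainer argument on the command line.
import json

ds_config = {
    "train_micro_batch_size_per_gpu": "auto",
    "gradient_accumulation_steps": "auto",
    "gradient_clipping": "auto",
    "bf16": {"enabled": "auto"},
    "zero_optimization": {
        "stage": 2,                    # partition optimizer state + gradients
        "overlap_comm": True,          # overlap reduction with the backward pass
        "contiguous_gradients": True,  # reduce memory fragmentation
    },
}

with open("deepspeed_config_s2.json", "w") as f:
    json.dump(ds_config, f, indent=2)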