This issue is to be done after #99.

The `kernel-hyperdrive` has support for LoRA adapters, but we need to install it properly into the plugin. `prepare` happens during augmentation (still using the `model_loader`). In `augmentation` we intercept the `peft_config`, `fsdp_save`, and `load` functions to support adapters.

In addition we defer the following: supporting the `--bf16` or `--fp16` flags easily.

(stretch) Furthermore, we could also improve the LoRA Triton kernels to support `lora_r < 16`.
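As a rough illustration of the interception step, a plugin can wrap framework functions such as `load` so that adapter handling is injected before or after the original call. This is only a minimal sketch: `patch_function`, `with_adapter_support`, and the stand-in `framework` module are hypothetical names for illustration, not the actual `kernel-hyperdrive` or plugin API.

```python
import functools
import types

def patch_function(module, name, wrapper_factory):
    """Replace module.<name> with a wrapped version controlled by the
    plugin; the original callable is handed to the wrapper factory."""
    original = getattr(module, name)
    setattr(module, name, wrapper_factory(original))

def with_adapter_support(original):
    """Illustrative wrapper: delegate to the original function, then
    mark the result so adapter-specific logic could hook in here."""
    @functools.wraps(original)
    def wrapped(*args, **kwargs):
        result = original(*args, **kwargs)
        result["lora_enabled"] = True  # placeholder for real adapter handling
        return result
    return wrapped

# Demo: a stand-in "framework" module exposing a load function.
framework = types.SimpleNamespace(load=lambda path: {"path": path})
patch_function(framework, "load", with_adapter_support)
print(framework.load("ckpt"))  # → {'path': 'ckpt', 'lora_enabled': True}
```

The same pattern would apply to `peft_config` and `fsdp_save`: the plugin swaps in a wrapper at augmentation time and delegates to the original, so the rest of the stack is unaware of the interception.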