ScatterMoE to support LoRA Adapters #103

Open
fabianlim opened this issue Nov 6, 2024 · 0 comments
Labels: help wanted (Extra attention is needed), question (Further information is requested)

fabianlim (Contributor) commented Nov 6, 2024

This issue is to be done after #99.

kernel-hyperdrive already has support for LoRA adapters, but this support needs to be installed properly into the plugin:

  • Perhaps have the preparation happen during augmentation (while still using the model_loader); in augmentation we can intercept the peft_config (see the first sketch after this list).
  • Ensure that checkpoint saving works correctly with LoRA. It may be easier to just modify the fsdp_save and load functions to support adapters (see the second sketch after this list).
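
A minimal sketch of the first point, assuming the plugin's augmentation hook receives the peft_config through modifiable_args (as in the other fms-acceleration plugins); prepare_scattermoe_lora is a hypothetical helper standing in for the kernel-hyperdrive preparation step:

```python
from peft import LoraConfig


def prepare_scattermoe_lora(model, rank, alpha, target_modules):
    # Hypothetical stand-in for the kernel-hyperdrive step that swaps
    # the MoE blocks for LoRA-aware ScatterMoE modules.
    return model


class ScatterMoELoRAPlugin:
    def augmentation(self, model, train_args, modifiable_args):
        # Intercept the peft_config carried in modifiable_args.
        (peft_config,) = modifiable_args
        if isinstance(peft_config, LoraConfig):
            model = prepare_scattermoe_lora(
                model,
                rank=peft_config.r,
                alpha=peft_config.lora_alpha,
                target_modules=peft_config.target_modules,
            )
        return model, modifiable_args
```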
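
For the second point, a minimal sketch of an adapter-aware save, assuming an FSDP-wrapped PEFT model: gather a full state dict on rank 0 and keep only the LoRA tensors, so the checkpoint holds just the adapter weights (the matching load would filter the same way before handing the dict to the model):

```python
import torch
import torch.distributed as dist
from torch.distributed.fsdp import (
    FullStateDictConfig,
    FullyShardedDataParallel as FSDP,
    StateDictType,
)


def save_lora_adapters(model: FSDP, path: str):
    cfg = FullStateDictConfig(offload_to_cpu=True, rank0_only=True)
    with FSDP.state_dict_type(model, StateDictType.FULL_STATE_DICT, cfg):
        state = model.state_dict()
    if dist.get_rank() == 0:
        # Keep only adapter tensors, e.g. "...lora_A.default.weight".
        adapter_state = {k: v for k, v in state.items() if "lora_" in k}
        torch.save(adapter_state, path)
```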

In addition, we defer the following:

  • Handle the mixed precision here, since we can now obtain the --bf16 or --fp16 flags easily (a sketch follows).
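
A minimal sketch of resolving that dtype, assuming the plugin sees HuggingFace TrainingArguments (where --bf16 and --fp16 surface as boolean fields):

```python
import torch


def resolve_mp_dtype(train_args) -> torch.dtype:
    # Map the TrainingArguments flags to the dtype that the ScatterMoE
    # experts (and LoRA adapters) should be cast to.
    if getattr(train_args, "bf16", False):
        return torch.bfloat16
    if getattr(train_args, "fp16", False):
        return torch.float16
    return torch.float32
```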

(stretch) Furthermore, we could also improve the LoRA Triton kernels to support lora_r < 16; small ranks currently fail because Triton's tl.dot requires each block dimension to be at least 16.
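
One possible workaround, sketched under the assumption that the kernels reduce over the rank dimension with tl.dot: zero-pad the adapter matrices up to rank 16 on the host before the kernel launch; the padded rows and columns are zero, so the product is unchanged.

```python
import torch

MIN_TL_DOT_DIM = 16  # smallest block dimension tl.dot accepts


def pad_lora_rank(lora_A: torch.Tensor, lora_B: torch.Tensor):
    """lora_A: (r, in_features); lora_B: (out_features, r)."""
    r = lora_A.shape[0]
    if r >= MIN_TL_DOT_DIM:
        return lora_A, lora_B
    pad = MIN_TL_DOT_DIM - r
    # Zero rows appended to A and zero columns appended to B leave
    # B @ A mathematically identical to the unpadded product.
    lora_A = torch.nn.functional.pad(lora_A, (0, 0, 0, pad))
    lora_B = torch.nn.functional.pad(lora_B, (0, pad))
    return lora_A, lora_B
```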
