Merge pull request #4782 from FederatedAI/develop-1.11.0
fix RELEASE doc
dylan-fan committed Apr 13, 2023
2 parents 86a7b65 + b03e89e commit 5ac0567
Showing 1 changed file with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion RELEASE.md
@@ -1,5 +1,7 @@
 ## Release 1.11.0
 ### Major Features and Improvements
 > FederatedML
-* Support FedLLM (Federated Large Language Models)
+* Support FATE-LLM (Federated Large Language Models)
 * Integration of LLMs for federated learning: BERT, ALBERT, RoBERTa, GPT-2, BART, DeBERTa, and DistilBERT. Note that using these pretrained models requires compliance with their licenses.
 * Integration of parameter-efficient tuning methods for federated learning: Bottleneck Adapters (including the Houlsby, Pfeiffer, and Parallel schemes), Invertible Adapters, LoRA, IA3, and Compacter.
 * Improved the Homo Federated Trainer class, allowing CUDA device specification and DataParallel acceleration on multi-GPU machines.
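The LoRA scheme listed among the parameter-efficient tuning methods above trains a low-rank update instead of the full weight matrix. A minimal pure-Python sketch of the idea (illustrative only — no FATE-LLM or PyTorch APIs are used, and all names here are hypothetical, not part of FATE's interface):

```python
# LoRA idea in miniature: keep the frozen d_out x d_in weight W fixed and
# train two small factors B (d_out x r) and A (r x d_in); the effective
# weight is W_eff = W + (alpha / r) * (B @ A).

def matmul(X, Y):
    """Naive matrix multiply for small demo matrices (lists of lists)."""
    rows, inner, cols = len(X), len(Y), len(Y[0])
    return [[sum(X[i][k] * Y[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

def lora_effective_weight(W, A, B, alpha, r):
    """Return W + (alpha / r) * B @ A; only A and B hold trainable parameters."""
    delta = matmul(B, A)
    scale = alpha / r
    return [[W[i][j] + scale * delta[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

# Toy example: 2x2 frozen weight with a rank-1 adapter (r = 1).
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen pretrained weight
B = [[1.0], [2.0]]             # d_out x r factor
A = [[0.5, 0.5]]               # r x d_in factor
W_eff = lora_effective_weight(W, A, B, alpha=1.0, r=1)
# → [[1.5, 0.5], [1.0, 2.0]]
```

For large layers the savings come from training only `d_out*r + r*d_in` parameters with `r` much smaller than the layer dimensions, which is what makes schemes like LoRA attractive in a federated setting where updates must be transmitted between parties.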
