InternEvo is an open-source, lightweight training framework that aims to support model pre-training without the need for extensive dependencies.
Large-scale 4D-parallelism pre-training for 🤗 transformers with Mixture of Experts *(still a work in progress)*
Fast and easy distributed model training examples.
Democratizing Hugging Face model training with InternEvo
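
The repositories above all build on the same core idea: the sequence (time) dimension of activations is partitioned across devices so each rank only materializes its own chunk. The sketch below is a minimal, framework-agnostic illustration of that partitioning; the names (`split_sequence`, `world_size`) are hypothetical and not taken from any of the listed projects, and real implementations add communication (e.g. all-gather or ring attention) so attention can see the full sequence.

```python
# Minimal sketch of sequence partitioning: each simulated rank owns a
# contiguous slice of the sequence dimension. Hypothetical names, not an
# API of any listed framework.
import torch


def split_sequence(hidden: torch.Tensor, rank: int, world_size: int) -> torch.Tensor:
    """Return the contiguous sequence chunk owned by `rank`.

    hidden: [batch, seq_len, hidden_dim]; seq_len is assumed divisible by world_size.
    """
    chunk = hidden.size(1) // world_size
    return hidden[:, rank * chunk : (rank + 1) * chunk, :].contiguous()


if __name__ == "__main__":
    x = torch.randn(2, 8, 16)  # batch=2, seq_len=8, hidden=16
    world_size = 4
    shards = [split_sequence(x, r, world_size) for r in range(world_size)]
    # Each simulated rank holds seq_len / world_size = 2 tokens of the sequence.
    print([s.shape for s in shards])
    # Concatenating the shards along the sequence dim recovers the full tensor.
    assert torch.equal(torch.cat(shards, dim=1), x)
```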