Feature/load balance: add expert replacement feature for MoE model (Mixtral) #187

Open · wants to merge 5 commits into base: main
Conversation

@uygnef commented Apr 18, 2024

I have added a new feature to Megatron-LM that introduces a load-balance interval for expert replacement in Mixture of Experts (MoE) models. At user-specified intervals, experts are redistributed across GPUs so that each GPU processes a similar number of tokens, balancing the computational load.

Implementation Details
The load-balance interval is controlled by a new command-line argument, --load-balance-interval. Users specify the number of steps after which experts should be redistributed; the system then automatically adjusts expert placement to even out the workload, improving the overall efficiency of MoE model training.
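A minimal sketch of how such a periodic rebalancing step could look. The function name `plan_expert_placement`, the greedy assignment strategy, and the loop variables are illustrative assumptions, not the PR's actual implementation; the real code must also exchange expert parameters between ranks once a new placement has been planned.

```python
import torch

def plan_expert_placement(tokens_per_expert, num_ranks):
    """Greedily assign experts to ranks so recent token load is balanced.

    tokens_per_expert: 1-D tensor of token counts routed to each expert
    since the last rebalance. Returns a list mapping each rank to the
    experts it should host (equal expert count per rank).
    """
    experts_per_rank = len(tokens_per_expert) // num_ranks
    loads = [0] * num_ranks
    placement = [[] for _ in range(num_ranks)]
    # Longest-processing-time-first: place the busiest unassigned expert
    # on the least-loaded rank that still has a free slot.
    for e in torch.argsort(tokens_per_expert, descending=True).tolist():
        open_ranks = [r for r in range(num_ranks)
                      if len(placement[r]) < experts_per_rank]
        r = min(open_ranks, key=lambda r: loads[r])
        placement[r].append(e)
        loads[r] += tokens_per_expert[e].item()
    return placement

# In the training loop (sketch; `args`, `step`, `token_counts`, and
# `ep_world_size` stand in for the corresponding Megatron-LM state):
#
#   if args.load_balance_interval and step % args.load_balance_interval == 0:
#       placement = plan_expert_placement(token_counts, ep_world_size)
#       # ...exchange expert weights between ranks to realize `placement`...
```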

Benefits
Parallel strategy: tp4pp2ep2, 16 GPUs, trained from scratch, without aux loss.

| strategy | tokens/GPU/sec |
| --- | --- |
| without load balance | 1020 |
| with load balance | 1131 (+11%) |

How to Use
To enable periodic expert replacement, pass the --load-balance-interval argument with the desired number of steps between redistributions.
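For illustration, the flag might be registered alongside Megatron-LM's other arguments roughly as below; the helper name, argument group, default, and help text are assumptions, not the PR's exact diff.

```python
def _add_load_balance_args(parser):
    # Hypothetical helper in the style of Megatron-LM's arguments.py.
    group = parser.add_argument_group(title='moe load balance')
    group.add_argument('--load-balance-interval', type=int, default=None,
                       help='Number of training steps between expert '
                            'redistributions across GPUs; disabled when unset.')
    return parser
```

For example, passing `--load-balance-interval 100` in the launch command would trigger a redistribution every 100 steps (the value 100 is illustrative).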

