Avoid rewrapping modules with DDP/FSDP if already wrapped #12096
base: main
Conversation
Signed-off-by: Ananth Subramaniam <[email protected]>
@ananthsub The changes in 32cb34a, plus an `else` branch so that `dist_module` is never left unbound:

```python
if HAVE_CUSTOM_FSDP and self.ddp_config.use_custom_fsdp and not isinstance(unwrapped_module, FullyShardedDataParallel):
    ...
elif not isinstance(unwrapped_module, DDP):
    ...
else:
    dist_module = unwrapped_module
```
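The guard above can be sketched with stand-in classes (the real `DDP` and `FullyShardedDataParallel` wrappers come from Megatron/PyTorch; `wrap_module` and its `use_custom_fsdp` flag are hypothetical names for illustration):

```python
# Minimal sketch of the idempotent-wrapping guard, assuming simple
# wrapper classes. Stand-ins only; not the real Megatron/PyTorch types.
class _Wrapper:
    def __init__(self, module):
        self.module = module

class DDP(_Wrapper):
    pass

class FullyShardedDataParallel(_Wrapper):
    pass

def wrap_module(unwrapped_module, use_custom_fsdp=False):
    """Wrap a module exactly once, even across repeated calls."""
    if use_custom_fsdp and not isinstance(unwrapped_module, FullyShardedDataParallel):
        dist_module = FullyShardedDataParallel(unwrapped_module)
    elif not use_custom_fsdp and not isinstance(unwrapped_module, DDP):
        dist_module = DDP(unwrapped_module)
    else:
        # Already wrapped: return as-is, so dist_module is never unbound.
        dist_module = unwrapped_module
    return dist_module

m = object()
once = wrap_module(m)      # wrapped in DDP
twice = wrap_module(once)  # second call returns the same wrapper unchanged
```

The final `else` is what the review comment asks for: without it, a module that is already wrapped would fall through both branches and leave `dist_module` undefined.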
```diff
  # Avoid rewrapping the module if it's already of type Float16Module
  if hasattr(module, 'module') and not isinstance(module.module, Float16Module):
      module.module = Float16Module(config, module.module)
- else:
+ elif not isinstance(module, Float16Module):
      module = Float16Module(config, module)
```
We need these changes to be applied to MegatronMixedPrecision as well:

NeMo/nemo/lightning/pytorch/plugins/mixed_precision.py, lines 154 to 157 in d23e1d6:

```python
if hasattr(module, 'module'):
    module.module = Float16Module(config, module.module)
else:
    module = Float16Module(config, module)
```
However, even with this change we are wrapping with Float16Module again. The hierarchy of modules printed just before the convert_module() method in MegatronMixedPrecision:

1st run:

```
BionemoLightningModule(
  (module): ESM2FineTuneSeqModel(
```

2nd run:

```
BionemoLightningModule(
  (module): DDP(
    (module): Float16Module(
      (module): ESM2FineTuneSeqModel(
```
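The rewrap the reviewer observes can be reproduced with stand-in classes: after the first run, DDP sits between the Lightning module and the Float16Module, so `module.module` is a `DDP` instance, the `isinstance` guard does not fire, and the module is wrapped in Float16Module a second time. A minimal sketch (class names are stand-ins for the real NeMo/Megatron types; the `config` argument is dropped for brevity):

```python
# Sketch of why the guarded conversion still rewraps on a second run.
# Stand-in wrapper classes only; not the real NeMo/Megatron types.
class _Wrapper:
    def __init__(self, module):
        self.module = module

class DDP(_Wrapper): pass
class Float16Module(_Wrapper): pass
class BionemoLightningModule(_Wrapper): pass

def convert_module(module):
    # The guarded conversion from the suggested diff (config omitted).
    if hasattr(module, 'module') and not isinstance(module.module, Float16Module):
        module.module = Float16Module(module.module)
    elif not isinstance(module, Float16Module):
        module = Float16Module(module)
    return module

core = object()
lm = BionemoLightningModule(core)
convert_module(lm)          # 1st run: lm.module becomes Float16Module(core)
lm.module = DDP(lm.module)  # DDP wrapping happens between fit calls
convert_module(lm)          # 2nd run: lm.module is DDP, not Float16Module,
                            # so the guard misses and it is wrapped again
```

After the second call, `lm.module` is `Float16Module(DDP(Float16Module(core)))`: the `isinstance` check only inspects the outermost wrapper, so any intermediate wrapper (here DDP) defeats it.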
What does this PR do ?
With multiple trainer.fit calls on the same LightningModule, the underlying module could end up wrapped with DDP/FSDP multiple times. This adds a check so that in such cases the module is wrapped only once.
Collection: [Note which collection this PR will affect]
Changelog
Usage
# Add a code snippet demonstrating how to use this
GitHub Actions CI
The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.
The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR.
To re-run CI remove and add the label again.
To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".
Before your PR is "Ready for review"
Pre checks:
PR Type:
If you haven't finished some of the above items, you can still open a "Draft" PR.
Who can review?
Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines contain specific people who can review PRs to various areas.
Additional Information