
Update example usage to disable sequence_parallel #11225

Closed · wants to merge 1 commit

Conversation

@eagle705 (Contributor) commented Nov 8, 2024

What does this PR do?

Update the example usage for megatron_gpt_prune.py to disable the sequence_parallel option, resolving the error below:

Traceback (most recent call last):
  File "/opt/NeMo/examples/nlp/language_modeling/megatron_gpt_prune.py", line 93, in main
    model = MegatronGPTModel.restore_from(
  File "/opt/NeMo/nemo/collections/nlp/models/nlp_model.py", line 478, in restore_from
    return super().restore_from(
  File "/opt/NeMo/nemo/core/classes/modelPT.py", line 468, in restore_from
    instance = cls._save_restore_connector.restore_from(
  File "/opt/NeMo/nemo/collections/nlp/parts/nlp_overrides.py", line 1298, in restore_from
    loaded_params = super().load_config_and_state_dict(
  File "/opt/NeMo/nemo/core/connectors/save_restore_connector.py", line 182, in load_config_and_state_dict
    instance = calling_cls.from_config_dict(config=conf, trainer=trainer)
  File "/opt/NeMo/nemo/core/classes/common.py", line 530, in from_config_dict
    raise e
  File "/opt/NeMo/nemo/core/classes/common.py", line 522, in from_config_dict
    instance = cls(cfg=config, trainer=trainer)
  File "/opt/NeMo/nemo/collections/nlp/models/language_modeling/megatron_gpt_model.py", line 331, in __init__
    super().__init__(cfg, trainer=trainer, no_lm_init=True)
  File "/opt/NeMo/nemo/collections/nlp/models/language_modeling/megatron_base_model.py", line 146, in __init__
    self.model_parallel_config: ModelParallelConfig = self.build_model_parallel_config()
  File "/opt/NeMo/nemo/collections/nlp/models/language_modeling/megatron_base_model.py", line 1194, in build_model_parallel_config
    model_parallel_config = ModelParallelConfig(**mp_config_dict)
  File "<string>", line 58, in __init__
  File "/opt/megatron-lm/megatron/core/model_parallel_config.py", line 313, in __post_init__
    raise ValueError("Can not use sequence paralllelism without tensor parallelism")
ValueError: Can not use sequence paralllelism without tensor parallelism
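
For context, the failure comes from a validation check in Megatron-LM's ModelParallelConfig (model_parallel_config.py in the traceback above): sequence parallelism shards activations across tensor-parallel ranks, so it is only valid when tensor parallelism is enabled. The sketch below is a simplified, hypothetical stand-in for that check (ParallelSettings is not a real Megatron-LM class); it only illustrates the constraint the pruning command has to satisfy:

from dataclasses import dataclass

@dataclass
class ParallelSettings:
    # Simplified, illustrative stand-in for Megatron-LM's ModelParallelConfig.
    tensor_model_parallel_size: int = 1
    sequence_parallel: bool = False

    def __post_init__(self):
        # Sequence parallelism splits activations across tensor-parallel ranks,
        # so it requires tensor parallelism to be enabled.
        if self.sequence_parallel and self.tensor_model_parallel_size <= 1:
            raise ValueError("Cannot use sequence parallelism without tensor parallelism")

ParallelSettings(tensor_model_parallel_size=1, sequence_parallel=False)   # OK
ParallelSettings(tensor_model_parallel_size=2, sequence_parallel=True)    # OK
# ParallelSettings(tensor_model_parallel_size=1, sequence_parallel=True)  # raises ValueError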

Collection: NLP

Changelog

Usage

The +model.sequence_parallel=false override below is the line this PR adds to the example:

python examples/nlp/language_modeling/megatron_gpt_prune.py \
    model.restore_from_path=llama3.1-8b-instruct.nemo \
    model.tensor_model_parallel_size=1 \
    model.pipeline_model_parallel_size=8 \
    +model.sequence_parallel=false \
    trainer.num_nodes=1 \
    trainer.precision=bf16 \
    trainer.devices=8 \
    prune.ffn_hidden_size=9216 \
    prune.num_attention_heads=null \
    prune.num_query_groups=null \
    prune.hidden_size=3072 \
    export.save_path=llama3.1-8b-instruct-pruned.nemo
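
Note: the leading + in +model.sequence_parallel=false is Hydra's syntax for adding a key that is not already present in the loaded config; a plain model.sequence_parallel=false override only works if the key already exists in megatron_gpt_prune.yaml (which, per the discussion below, is the case in newer containers).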

GitHub Actions CI

The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.

The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR.
To re-run CI remove and add the label again.
To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (Ex: Numba, Pynini, Apex etc)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • New Feature
  • Bugfix
  • Documentation

If you haven't finished some of the above items, you can still open a "Draft" PR.

Who can review?

Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines list specific people who can review PRs to various areas.

Additional Information

  • Related to # (issue)

@github-actions bot added the NLP label Nov 8, 2024
@kevalmorabia97 (Collaborator) commented Nov 8, 2024

sequence_parallel: false has already been added to the default config, so it should not be needed anymore if you use the latest container. Can you please confirm?

https://github.com/eagle705/NeMo/blob/main/examples/nlp/language_modeling/conf/megatron_gpt_prune.yaml#L26

@eagle705 closed this by deleting the head repository Nov 8, 2024
@eagle705 (Contributor, Author) commented Nov 8, 2024

@kevalmorabia97
Right, it was already added, as you mentioned. This issue should be resolved. Thanks!
