
replace create_seed_checkpoint.sh with a note in docs/checkpoint.md #736

Merged
merged 1 commit into gh/tianyu-l/27/base from gh/tianyu-l/27/head on Dec 16, 2024

Conversation

@tianyu-l (Contributor) commented on Dec 13, 2024

Stack from ghstack (oldest at bottom):

This PR does the following:

  1. Remove create_seed_checkpoint.sh, since it is essentially run_llama_train.sh with a particular config, so the useful functionality is preserved.
  2. Fix the error in setting DTensor random seeds when creating a seed checkpoint, by checking that spmd_mesh.get_coordinate() is not None (see the sketch after this list).
  3. Add documentation to docs/checkpoint.md on how to generate a seed checkpoint.
  4. Remove its use in CI, since pipeline parallelism (PP) can run without one.
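
For item 2, here is a minimal sketch of the kind of guard described above. It is not the PR's actual diff: the function name set_dtensor_seed and its call site are hypothetical, and the seeding call assumes the torch.distributed._tensor.random.manual_seed API available in recent PyTorch releases.

```python
# Hypothetical sketch (not the PR's code) of guarding DTensor seed initialization.
from torch.distributed.device_mesh import DeviceMesh
from torch.distributed._tensor import random as dtensor_random


def set_dtensor_seed(spmd_mesh: DeviceMesh, seed: int) -> None:
    # get_coordinate() returns None when the current rank does not belong to
    # spmd_mesh (e.g. a single-rank run used to create a seed checkpoint),
    # so only initialize the DTensor RNG state on ranks that are in the mesh.
    if spmd_mesh.get_coordinate() is not None:
        dtensor_random.manual_seed(seed, spmd_mesh)
```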

tianyu-l added a commit that referenced this pull request Dec 13, 2024
ghstack-source-id: 6b70ca7604d6701fac0e34d623826d91922a0424
Pull Request resolved: #736
@facebook-github-bot added the CLA Signed label (managed by the Meta Open Source bot) on Dec 13, 2024
@tianyu-l merged commit 86c8e55 into gh/tianyu-l/27/base on Dec 16, 2024; 4 of 5 checks passed
@tianyu-l deleted the gh/tianyu-l/27/head branch on December 16, 2024 at 03:09