
freeze official configs for reproductions #421

Merged

epwalsh merged 8 commits into main from epwalsh/official-configs on Feb 5, 2024
Conversation

epwalsh (Member) commented on Feb 1, 2024

Creates official copies of the configs we used to train the released models, with sensible names.

Still TODO:

  • Save 1B config to configs/official/OLMo-1B.yaml. @soldni
  • Update paths in those configs to point to data on R2 instead of S3, once that's copied over. @2015aroras
  • Update README with pointers to these configs and examples of how to use them (see the sketch just after this list). @epwalsh
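
For reference, these official configs are plain YAML training configs consumed by the repo's training script. The sketch below is only a guess at the rough shape of such a file; every field name and value is an illustrative assumption, not a quote from the merged configs:

```yaml
# Hypothetical sketch of configs/official/OLMo-1B.yaml; every value here
# is an illustrative assumption, not copied from the actual file.
run_name: OLMo-1B
model:
  d_model: 2048              # assumed hidden size for the 1B model
  n_heads: 16                # assumed
  n_layers: 16               # assumed
  max_sequence_length: 2048  # assumed
```

Once the README TODO above lands, usage would presumably be a one-liner pointing the training script at one of these files, along the lines of `torchrun --nproc_per_node=8 scripts/train.py configs/official/OLMo-1B.yaml` (exact invocation per the eventual README).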

epwalsh linked an issue on Feb 1, 2024 that may be closed by this pull request
soldni (Member) commented on Feb 2, 2024

Hey @epwalsh, added the 1B config and set the correct EOS token on both 1B and 7B. Didn't touch any data paths, lmk how you'd like to handle it.
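
For readers following along: the EOS-token fix soldni describes would live in the model/tokenizer sections of each YAML config. A minimal sketch, assuming the GPT-NeoX-style tokenizer OLMo ships with; the specific IDs and the tokenizer path are assumptions, not quotes from the merged diff:

```yaml
# Hypothetical sketch of the EOS-related fields; IDs and the tokenizer
# path are assumptions, not copied from the merged configs.
model:
  eos_token_id: 50279  # assumed GPT-NeoX-style EOS id
  pad_token_id: 1      # assumed
tokenizer:
  identifier: tokenizers/allenai_eleuther-ai-gpt-neox-20b-pii-special.json  # assumed path
```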

epwalsh (Member, Author) commented on Feb 2, 2024

Thanks @soldni, I'll update paths in both configs once the data is public on R2.
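
Concretely, that path swap would just rewrite the entries under `data.paths` in each config from internal S3 URLs to publicly readable R2 URLs. An illustrative before/after, with entirely hypothetical bucket names and key layout:

```yaml
# Hypothetical before/after for a single data path entry; bucket names
# and key layout are invented for illustration.
data:
  paths:
    # before (internal S3):
    # - s3://example-internal-bucket/preprocessed/olmo-mix/part-000.npy
    # after (public R2, served over HTTPS):
    - https://example-r2-host.org/preprocessed/olmo-mix/part-000.npy
```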

epwalsh merged commit 15af668 into main on Feb 5, 2024; all 10 checks passed.
epwalsh deleted the epwalsh/official-configs branch on February 5, 2024.
Development: successfully merging this pull request may close the linked issue “Training code”.