Add IREE numerics test for Llama 3.1 8B FP16 TP8 #394

Open
wants to merge 1 commit into main

Conversation

sogartar (Contributor)

Introduce a Llama 3.1 8B FP16 TP8 test, which currently appears to exhibit poor numerical accuracy. The results are compared against an unsharded fp64 Torch variant, so that the reference itself is known to be highly accurate.
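
As a rough illustration of the comparison strategy (not the PR's actual test code), the sketch below runs the same toy computation in fp16 and fp64 and checks the fp16 result against the fp64 reference within a tolerance. The toy reduction and the tolerances are placeholders standing in for the real Llama 3.1 8B model and the test's real thresholds.

```python
# A minimal, self-contained sketch of the fp16-vs-fp64 comparison
# pattern. toy_model and the tolerances are placeholders for the real
# Llama 3.1 8B model and the PR's actual accuracy thresholds.
import torch


def toy_model(x: torch.Tensor, w: torch.Tensor) -> torch.Tensor:
    # A dot-product-like reduction standing in for the real model.
    return (x.unsqueeze(1) * w.unsqueeze(0)).sum(dim=-1)


def test_fp16_matches_fp64_reference():
    torch.manual_seed(0)
    x = torch.rand(4, 32, dtype=torch.float64)
    w = torch.rand(16, 32, dtype=torch.float64)
    expected = toy_model(x, w)  # fp64 reference
    actual = toy_model(x.to(torch.float16), w.to(torch.float16))
    # Compare in fp64 so the check itself adds no extra rounding error.
    torch.testing.assert_close(
        actual.to(torch.float64), expected, atol=1e-1, rtol=1e-2
    )
```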

Refactor the sharded Llama tests to increase code reuse, and use the TorchGenerator in the toy-sized tests. Drive the test flow through the shard_llm_dataset and export_paged_llm_v1 scripts to increase their test coverage.
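
For illustration, driving those scripts from a test could look like the sketch below. The module paths follow the script names under sharktank/examples, and the flag names are assumptions that should be checked against each script's --help; this is not the PR's actual harness.

```python
# A hedged sketch of exercising the sharding/export scripts as
# subprocesses from a test. Module paths and flag names are assumptions
# based on the script names; verify them against each script's --help.
import subprocess
import sys


def export_sharded_llama(irpa_path: str, tmp_dir: str, tp: int = 8) -> str:
    sharded_irpa = f"{tmp_dir}/model_tp{tp}.irpa"
    subprocess.run(
        [
            sys.executable, "-m",
            "sharktank.examples.sharding.shard_llm_dataset",
            "--irpa-file", irpa_path,              # flag name is an assumption
            "--output-irpa-file", sharded_irpa,    # flag name is an assumption
            "--tensor-parallelism-size", str(tp),  # flag name is an assumption
        ],
        check=True,
    )
    mlir_path = f"{tmp_dir}/model_tp{tp}.mlir"
    subprocess.run(
        [
            sys.executable, "-m",
            "sharktank.examples.export_paged_llm_v1",
            "--irpa-file", sharded_irpa,  # flag name is an assumption
            "--output-mlir", mlir_path,   # flag name is an assumption
        ],
        check=True,
    )
    return mlir_path
```

Invoking the scripts as subprocesses, rather than importing their main functions, exercises their CLI surface, which is what gives the extra coverage the description mentions.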

sogartar requested review from rsuderman and IanNod on October 31, 2024 at 11:30
sogartar (Contributor, Author)

This PR depends on #383, #384, #386, #390, #391, #392, #393.
