
Add support for sharded models when TorchAO quantization is enabled #12204

Triggered via pull request: December 20, 2024, 00:47
Status: Success
Total duration: 28s
Artifacts: none listed

pr_flax_dependency_test.yml

on: pull_request
Job: check_flax_dependencies (20s)
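For context, a minimal sketch of what a workflow like pr_flax_dependency_test.yml could look like, based only on what is visible in this run summary (the on: pull_request trigger and the check_flax_dependencies job name). The step contents, Python version, and the assumption that this run belongs to the diffusers repository are illustrative guesses, not the repository's actual workflow definition.

name: Flax dependency test

on: pull_request

jobs:
  check_flax_dependencies:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      # Hypothetical check: importing the package must succeed without Flax installed
      - name: Check that Flax is not a hard dependency
        run: |
          pip install -e .
          python -c "import diffusers"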