Add support for sharded models when TorchAO quantization is enabled #27078

Annotations: 1 warning

Fast Flax CPU tests — succeeded Dec 20, 2024 in 1m 57s