Key frameworks:

```shell
uv sync
```

Optional extras:

- `optimizers`: some popular optimizers (e.g. `schedulefree`)
- `quant`: quantization tools (e.g. `bitsandbytes`)

To use all extras:

```shell
uv sync --extra optimizers --extra quant
```
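With uv, extras like these are declared under `[project.optional-dependencies]` in `pyproject.toml`. A hypothetical sketch of how that declaration might look (the exact pins in the repo may differ):

```toml
# Hypothetical pyproject.toml fragment; the repo's actual version
# constraints are not shown in this README.
[project.optional-dependencies]
optimizers = ["schedulefree"]
quant = ["bitsandbytes"]
```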
Distributed training is not tested yet.
Simplest example:

```shell
fabric run \
  ./main.py \
  --config ./configs/mnist.yaml
```
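`fabric run` forwards everything after the script path to the script itself, so `main.py` sees `--config` as an ordinary CLI argument. A minimal sketch of how it might be parsed (the `parse_args` helper is an assumption for illustration, not the repo's actual code):

```python
import argparse

def parse_args(argv=None):
    # Hypothetical CLI surface: the only flag passed in the example
    # above is --config, pointing at a YAML file.
    parser = argparse.ArgumentParser(description="training entry point")
    parser.add_argument("--config", required=True,
                        help="path to a YAML config, e.g. ./configs/mnist.yaml")
    return parser.parse_args(argv)

# Simulate the argv that `fabric run ./main.py --config ...` produces:
args = parse_args(["--config", "./configs/mnist.yaml"])
print(args.config)  # → ./configs/mnist.yaml
```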
With options:

```shell
fabric run \
  --accelerator cuda \
  --precision bf16-mixed \
  ./main.py \
  --config ./configs/mnist.yaml
```
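The referenced `./configs/mnist.yaml` is not shown in this README; a hypothetical config of this shape illustrates what such a file typically contains (every key below is an assumption, not the repo's actual schema):

```yaml
# Hypothetical example config; the real configs/mnist.yaml may differ.
model:
  hidden_dim: 128
optimizer:
  name: schedulefree   # available via the optimizers extra
  lr: 1.0e-3
train:
  batch_size: 64
  epochs: 10
```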