Note: This repository was archived by the owner on Feb 1, 2024 and is now read-only.


Lightning extension: Horovod


Horovod allows the same training script to be used for single-GPU, multi-GPU, and multi-node training.

Like Distributed Data Parallel (DDP), each Horovod process operates on a single GPU with a fixed subset of the data. Gradients are averaged across all GPUs in parallel during the backward pass, then applied synchronously before the next step begins.
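
Roughly, this is the kind of setup the strategy wires up for you under the hood; with the raw Horovod API it can be sketched as follows (the tiny model and learning rate are placeholders, not code from this repository):

import torch
import horovod.torch as hvd

hvd.init()
if torch.cuda.is_available():
    torch.cuda.set_device(hvd.local_rank())  # bind this worker to one GPU

model = torch.nn.Linear(32, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# start every worker from identical weights
hvd.broadcast_parameters(model.state_dict(), root_rank=0)

# DistributedOptimizer averages gradients across all workers during backward()
optimizer = hvd.DistributedOptimizer(optimizer, named_parameters=model.named_parameters())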

The number of worker processes is configured by a driver application (horovodrun or mpirun). Horovod detects the number of workers from the environment inside the training script and automatically scales the learning rate to compensate for the increased total batch size.
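
For illustration, that learning-rate scaling rule can be written explicitly against the raw Horovod API (a minimal sketch; base_lr is a placeholder value):

import horovod.torch as hvd

hvd.init()

base_lr = 0.01                    # rate tuned for a single worker
scaled_lr = base_lr * hvd.size()  # hvd.size() = total number of worker processes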

Horovod can be configured in the training script to run with any number of GPUs / processes as follows:

from lightning import Trainer
from lightning_horovod import HorovodStrategy

# train Horovod on GPU (number of GPUs/machines provided on the command line)
trainer = Trainer(strategy="horovod", accelerator="gpu", devices=1)

# train Horovod on CPU (number of processes/machines provided on the command line)
trainer = Trainer(strategy=HorovodStrategy())
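
Putting it together, a minimal self-contained train.py could look like the sketch below (the module, data, and hyperparameters are illustrative placeholders, not code from this repository):

import torch
from torch.utils.data import DataLoader, TensorDataset
from lightning import LightningModule, Trainer
from lightning_horovod import HorovodStrategy


class LitModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)


if __name__ == "__main__":
    dataset = TensorDataset(torch.randn(256, 32), torch.randn(256, 2))
    loader = DataLoader(dataset, batch_size=16)

    # each worker launched by the driver executes this script with one device
    trainer = Trainer(strategy=HorovodStrategy(), accelerator="cpu", max_epochs=1)
    trainer.fit(LitModel(), loader)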

When starting the training job, the driver application is then used to specify the total number of worker processes:

# run training with 4 GPUs on a single machine
horovodrun -np 4 python train.py

# run training with 8 GPUs on two machines (4 GPUs each)
horovodrun -np 8 -H hostname1:4,hostname2:4 python train.py
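
Since the driver can also be mpirun, an equivalent Open MPI launch could look like this (a sketch; additional MPI flags such as process binding depend on your cluster):

# run training with 8 GPUs on two machines (4 GPUs each) via Open MPI
mpirun -np 8 -H hostname1:4,hostname2:4 python train.py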

See the official Horovod documentation for installation and performance tuning details.
