This repository has been archived by the owner on Jul 1, 2024. It is now read-only.

Log when we are using gradient accumulation (#647)
Summary: Pull Request resolved: #647

Reviewed By: vreis

Differential Revision: D24840948 (3c012ba)

fbshipit-source-id: 6e6837cd5b742ae185a9ae7959031865d7d237e3
mannatsingh authored and facebook-github-bot committed Nov 12, 2020
1 parent 1ba6b03 commit 49a3762
Showing 1 changed file with 5 additions and 1 deletion.
6 changes: 5 additions & 1 deletion classy_vision/tasks/classification_task.py
@@ -717,7 +717,7 @@ def prepare(self):
         if self.simulated_global_batchsize is not None:
             if self.simulated_global_batchsize % self.get_global_batchsize() != 0:
                 raise ValueError(
-                    f"Global batch size ({self.get_global_batchsize()}) must divide"
+                    f"Global batch size ({self.get_global_batchsize()}) must divide "
                     f"simulated_global_batchsize ({self.simulated_global_batchsize})"
                 )
         else:
@@ -726,6 +726,10 @@ def prepare(self):
         self.optimizer_period = (
             self.simulated_global_batchsize // self.get_global_batchsize()
         )
+        if self.optimizer_period > 1:
+            logging.info(
+                f"Using gradient accumulation with a period of {self.optimizer_period}"
+            )

         if self.checkpoint_path:
             self.checkpoint_dict = load_and_broadcast_checkpoint(self.checkpoint_path)
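For context, the `optimizer_period` computed above drives gradient accumulation: when the simulated global batch size is a multiple of the actual global batch size, the optimizer steps only once every `optimizer_period` batches, letting gradients accumulate in between. A minimal standalone sketch of that computation and the new log line (the helper names `compute_optimizer_period` and `should_step` are hypothetical, not Classy Vision APIs):

```python
import logging

def compute_optimizer_period(simulated_global_batchsize, global_batchsize):
    """Mirror of the divisibility check and period computation in prepare()."""
    if simulated_global_batchsize % global_batchsize != 0:
        raise ValueError(
            f"Global batch size ({global_batchsize}) must divide "
            f"simulated_global_batchsize ({simulated_global_batchsize})"
        )
    optimizer_period = simulated_global_batchsize // global_batchsize
    # The behavior added by this commit: announce when accumulation is active.
    if optimizer_period > 1:
        logging.info(
            f"Using gradient accumulation with a period of {optimizer_period}"
        )
    return optimizer_period

def should_step(batch_index, optimizer_period):
    # Hypothetical helper: step the optimizer after every
    # `optimizer_period`-th batch (0-indexed batch counter).
    return (batch_index + 1) % optimizer_period == 0

period = compute_optimizer_period(simulated_global_batchsize=256, global_batchsize=64)
step_batches = [i for i in range(8) if should_step(i, period)]
print(period)        # 4
print(step_batches)  # [3, 7]
```

With a simulated global batch size of 256 and an actual global batch size of 64, the period is 4, so over eight batches the optimizer steps after batches 3 and 7; a size that does not divide evenly (e.g. 250 with 64) raises the `ValueError` whose message the commit also fixes.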
