store the ema model in f16 to save on space for post hoc ema
lucidrains committed Feb 8, 2024 · 1 parent 8ad8913 · commit c77058c
Showing 2 changed files with 2 additions and 2 deletions.
ema_pytorch/post_hoc_ema.py (1 addition, 1 deletion)

@@ -320,7 +320,7 @@ def checkpoint(self):
         filename = f'{ind}.{step}.pt'
         path = self.checkpoint_folder / filename

-        pkg = ema_model.state_dict()
+        pkg = deepcopy(ema_model).half().state_dict()
         torch.save(pkg, str(path))

    @beartype
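The one-line change above can be illustrated in isolation: deep-copying the EMA model before calling `.half()` downcasts only the copy, so the checkpoint written to disk is roughly half the size while the live EMA weights stay in full precision. This is a minimal sketch under that assumption; `SmallNet` is a hypothetical stand-in module, not part of ema-pytorch.

```python
import copy

import torch
from torch import nn

# Hypothetical stand-in for the EMA model being checkpointed
class SmallNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(16, 4)

ema_model = SmallNet()

# deepcopy first, so .half() does not downcast the model still in use;
# the saved state dict holds float16 tensors, halving the file size
pkg = copy.deepcopy(ema_model).half().state_dict()

assert all(t.dtype == torch.float16 for t in pkg.values())
# the original EMA model keeps its float32 parameters
assert all(p.dtype == torch.float32 for p in ema_model.parameters())

# when loading a checkpoint later, cast back up to float32
restored = SmallNet()
restored.load_state_dict({k: v.float() for k, v in pkg.items()})
```

For post hoc EMA this is a reasonable trade-off: the checkpoints exist only to be combined after training, so storing them in float16 costs little precision relative to the space saved.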
setup.py (1 addition, 1 deletion)

@@ -3,7 +3,7 @@
 setup(
   name = 'ema-pytorch',
   packages = find_packages(exclude=[]),
-  version = '0.4.1',
+  version = '0.4.2',
   license='MIT',
   description = 'Easy way to keep track of exponential moving average version of your pytorch module',
   author = 'Phil Wang',