Bug fix: printing non-distributed data (#1756)
* make 1-proc print great again

* fix tabs size

* skip formatter on non-distr data

* remove time import

(cherry picked from commit 3082dd9)
ClaudiaComito authored and github-actions[bot] committed Jan 21, 2025
1 parent f75f1d3 commit 3303a5c
Showing 1 changed file with 4 additions and 1 deletion.
heat/core/printing.py: 4 additions & 1 deletion
@@ -303,6 +303,9 @@ def _tensor_str(dndarray, indent: int) -> str:
     # to do so, we slice up the torch data and forward it to torch internal printing mechanism
     summarize = elements > get_printoptions()["threshold"]
     torch_data = _torch_data(dndarray, summarize)
+    if not dndarray.is_distributed():
+        # let torch handle formatting on non-distributed data
+        # formatter gets too slow for even moderately large tensors
+        return torch._tensor_str._tensor_str(torch_data, indent)
     formatter = torch._tensor_str._Formatter(torch_data)

     return torch._tensor_str._tensor_str_with_formatter(torch_data, indent, summarize, formatter)
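The patch follows a common fast-path dispatch pattern: when the cheap whole-tensor path suffices (non-distributed data), skip the expensive per-element formatter entirely. A minimal stand-alone sketch of that pattern, independent of heat and torch (`fast_repr`, `formatted_repr`, and `tensor_str` are hypothetical stand-ins, not APIs from either library):

```python
def fast_repr(values, indent=0):
    # Cheap path: one pass over the raw values, analogous to handing a
    # non-distributed tensor straight to torch's built-in printing.
    pad = " " * indent
    return pad + "[" + ", ".join(str(v) for v in values) + "]"


def formatted_repr(values, indent=0, width=8):
    # Expensive path: a per-element formatter that right-aligns every
    # value to a common width, analogous to torch's _Formatter pass.
    pad = " " * indent
    body = ", ".join(f"{v:>{width}}" for v in values)
    return pad + "[" + body + "]"


def tensor_str(values, indent=0, distributed=False):
    # Mirror of the patch: non-distributed data takes the fast path
    # and never pays the cost of the formatter.
    if not distributed:
        return fast_repr(values, indent)
    return formatted_repr(values, indent)
```

The guard is placed after the data is gathered but before the formatter is constructed, so the distributed path is unchanged while the single-process path avoids the formatter's overhead.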
