[FutureWarning] Fix the warning triggered by `torch.cuda.reset_max_memory_allocated()` usage.
#14866
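Context for the change: newer PyTorch releases emit a `FutureWarning` from `torch.cuda.reset_max_memory_allocated()`, which now simply forwards to `torch.cuda.reset_peak_memory_stats()`. A minimal sketch of the replacement is below; the device handling and surrounding code are illustrative assumptions, not the exact call sites touched by this PR.

```python
import torch

# Before (deprecated): triggers a FutureWarning in recent PyTorch versions,
# because reset_max_memory_allocated() now forwards to reset_peak_memory_stats().
# torch.cuda.reset_max_memory_allocated(device)

# After: call the documented successor directly. Note it resets *all* peak
# memory stats for the device, not just the max-allocated counter.
if torch.cuda.is_available():
    device = torch.device("cuda:0")
    torch.cuda.reset_peak_memory_stats(device)
    peak_bytes = torch.cuda.max_memory_allocated(device)
    print(f"Peak allocated since reset: {peak_bytes} bytes")
```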