Question about splitting batches in 'dp' mode #5802
Unanswered
xiaopi-ouo
asked this question in
DDP / multi-GPU / multi-node
Replies: 2 comments
-
I don't think this is the same issue as the one you linked.
-
Thank you :) I finally solved it by padding manually.
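For anyone landing here later: a minimal sketch of what "padding manually" could look like in this case, assuming each sample's labels entry is a 1-D tensor of varying length and using a custom collate_fn. The names pad_labels_collate and PAD_VALUE are illustrative, not from this thread.

```python
import torch
from torch.nn.utils.rnn import pad_sequence

PAD_VALUE = -100  # assumption: a sentinel value the loss can ignore

def pad_labels_collate(samples):
    # samples is the list of dataset items; each 'labels' entry is a
    # 1-D tensor whose length differs from sample to sample
    labels = [sample['labels'] for sample in samples]
    lengths = torch.tensor([len(l) for l in labels])
    # pad_sequence stacks the variable-length tensors into a single
    # (batch, max_len) tensor, which can then be split along dim 0
    padded = pad_sequence(labels, batch_first=True, padding_value=PAD_VALUE)
    return {'labels': padded, 'lengths': lengths}
```

The function would be passed to torch.utils.data.DataLoader via its collate_fn argument; keeping the lengths tensor lets the step recover the original, unpadded labels afterwards.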
-
Hello, I faced the same problem as mentioned in #1508.
Here, samples is a list of tensors with different sizes. That's unavoidable in my case (each sample has a different number of parts to be predicted), so I can't solve it by just converting them into a tensor:
torch.tensor([sample['labels'] for sample in samples])
Is there another solution?
Thanks.
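As a short illustration of why the plain list is a problem and what a padded layout buys you (a sketch only; the tensors and the two-way split below are made-up examples): dp-style splitting chunks tensors along dimension 0, so once the labels are a single padded tensor they can be split across devices, and the per-sample tensors can be recovered from the stored lengths.

```python
import torch

# two samples whose labels have lengths 3 and 2, padded as in the sketch above
padded = torch.tensor([[1, 2, 3],
                       [4, 5, -100]])
lengths = torch.tensor([3, 2])

# dp-style splitting across two devices chunks the batch along dim 0
chunks = torch.chunk(padded, chunks=2, dim=0)

# inside the step, the variable-length labels can be recovered per sample
labels_per_sample = [row[:n] for row, n in zip(padded, lengths)]
```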