Overflow while accumulating output_split_sizes
Is there an existing issue for this bug?
The bug has not been fixed in the latest main branch
Do you feel comfortable sharing a concise (minimal) script that reproduces the error? :)
Yes, I will share a minimal reproducible script.
🐛 Describe the bug
[rank104]: File "/opt/conda/lib/python3.8/site-packages/colossalai/shardformer/modeling/deepseek_v3.py", line 81, in forward
[rank104]: y = self.moe_forward(hidden_states, topk_idx, topk_weight).view(*orig_shape)
[rank104]: File "/opt/conda/lib/python3.8/site-packages/colossalai/shardformer/modeling/deepseek_v3.py", line 100, in moe_forward
[rank104]: gathered_tokens, _ = all_to_all_uneven(sorted_tokens, input_split_sizes, output_splits, self.ep_group)
[rank104]: File "/opt/conda/lib/python3.8/site-packages/colossalai/moe/_operation.py", line 452, in all_to_all_uneven
[rank104]: return AllToAllUneven.apply(inputs, input_split_sizes, output_split_sizes, group, overlap, fp8_communication)
[rank104]: File "/opt/conda/lib/python3.8/site-packages/torch/autograd/function.py", line 574, in apply
[rank104]: return super().apply(*args, **kwargs) # type: ignore[misc]
[rank104]: File "/opt/conda/lib/python3.8/site-packages/colossalai/moe/_operation.py", line 428, in forward
[rank104]: return _all_to_all(
[rank104]: File "/opt/conda/lib/python3.8/site-packages/colossalai/moe/_operation.py", line 395, in _all_to_all
[rank104]: outputs = torch.empty(outputs_shape, dtype=inputs.dtype, device=inputs.device)
[rank104]: RuntimeError: Trying to create tensor with negative dimension -2058873370790320781: [-2058873370790320781, 7168]
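For reference, here is a minimal standalone sketch (not the ColossalAI code) of the suspected failure mode: if the running total of output_split_sizes is accumulated in a narrower integer type, it can wrap around to a negative value, and torch.empty then fails exactly as in the traceback. The counts and the int32 dtype below are illustrative assumptions, not values taken from the actual run or from _operation.py.

```python
import torch

hidden_dim = 7168  # second dimension seen in the error message

# Hypothetical per-destination token counts; the real values come from the
# MoE routing and the all-to-all size exchange, and are not known here.
output_split_sizes = [2_000_000_000, 2_000_000_000]

# If the running total is kept in a 32-bit integer, it silently wraps around:
total = torch.zeros((), dtype=torch.int32)
for size in output_split_sizes:
    total += size
print(int(total))  # negative after the wrap-around

# torch.empty then raises the same kind of error as in the traceback above:
torch.empty((int(total), hidden_dim))
# RuntimeError: Trying to create tensor with negative dimension ...
```

Accumulating the split sizes in int64 (or validating that the total is non-negative before allocating the output buffer) would avoid this particular wrap-around, assuming overflow during accumulation is indeed the cause here.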
Environment
No response