easytorch-training - ERROR RuntimeError: Expected 2D (unbatched) or 3D (batched) input to conv1d, but got input of size: [8, 32, 207, 13] #26
This is a torch version problem. I recommend 1.10.0 or 1.9.1; I have tested both. Newer versions of torch seem incompatible with certain operators, which causes this error.
It seems to be running smoothly now.
After torch 1.10, `F.conv1d` strictly validates the input's dimensionality, so the 2D kernel size that GraphWaveNet passes to `nn.Conv1d` no longer works on 4D input. The fix will be changing the affected `nn.Conv1d` layers to `nn.Conv2d`.
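A minimal sketch of that fix (assumptions: the layer name `gate_conv` and the channel/kernel values are illustrative, chosen to match the shapes in the traceback, not copied from the STEP source):

```python
import torch
import torch.nn as nn

# Input shaped [batch, channels, num_nodes, seq_len], as in the error: [8, 32, 207, 13]
x = torch.randn(8, 32, 207, 13)

# Old GraphWaveNet-style declaration: nn.Conv1d with a 2D kernel size.
# torch <= 1.9 tolerated this on 4D input; torch >= 1.10 raises
# "Expected 2D (unbatched) or 3D (batched) input to conv1d".
# gate_conv = nn.Conv1d(in_channels=32, out_channels=32, kernel_size=(1, 2))

# Fix: declare the same layer as a true 2D convolution.
gate_conv = nn.Conv2d(in_channels=32, out_channels=32, kernel_size=(1, 2))
out = gate_conv(x)
print(list(out.shape))  # seq_len shrinks by kernel_size - 1
```

The weight tensor has the same shape either way; only the dimensionality check differs, which is why older torch versions ran this code unchanged.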
Is there any way to fix this other than downgrading the version?
Thanks for your earlier reply. I have now set up the environment again, but I am still running into some problems. Have you encountered this before?
```text
2022-12-07 13:27:34,613 - easytorch-training - INFO - Epoch 1 / 100
  0%|          | 0/2997 [00:08<?, ?it/s]
2022-12-07 13:27:42,909 - easytorch-training - ERROR - Traceback (most recent call last):
  File "D:\python-venv\STEP\venv2\lib\site-packages\easytorch\launcher\launcher.py", line 52, in training_func
    runner.train(cfg)
  File "D:\python-venv\STEP\venv2\lib\site-packages\easytorch\core\runner.py", line 351, in train
    loss = self.train_iters(epoch, iter_index, data)
  File "D:\python-venv\STEP\basicts\runners\base_tsf_runner.py", line 237, in train_iters
    forward_return = list(self.forward(data=data, epoch=epoch, iter_num=iter_num, train=True))
  File "D:\python-venv\STEP\step\step_runner\step_runner.py", line 66, in forward
    prediction, pred_adj, prior_adj, gsl_coefficient = self.model(history_data=history_data, long_history_data=long_history_data, future_data=None, batch_seen=iter_num, epoch=epoch)
  File "D:\python-venv\STEP\venv2\lib\site-packages\torch\nn\modules\module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "D:\python-venv\STEP\step\step_arch\step.py", line 65, in forward
    y_hat = self.backend(short_term_history, hidden_states=hidden_states, sampled_adj=sampled_adj).transpose(1, 2)
  File "D:\python-venv\STEP\venv2\lib\site-packages\torch\nn\modules\module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "D:\python-venv\STEP\step\step_arch\graphwavenet\model.py", line 197, in forward
    gate = self.gate_convs[i](x)
  File "D:\python-venv\STEP\venv2\lib\site-packages\torch\nn\modules\module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "D:\python-venv\STEP\venv2\lib\site-packages\torch\nn\modules\conv.py", line 313, in forward
    return self._conv_forward(input, self.weight, self.bias)
  File "D:\python-venv\STEP\venv2\lib\site-packages\torch\nn\modules\conv.py", line 309, in _conv_forward
    return F.conv1d(input, weight, bias, self.stride,
RuntimeError: Expected 2D (unbatched) or 3D (batched) input to conv1d, but got input of size: [8, 32, 207, 13]
Traceback (most recent call last):
  File "D:\python-venv\STEP\step\run.py", line 27, in <module>
    launch_training(args.cfg, args.gpus)
  File "D:\python-venv\STEP\basicts\launcher.py", line 19, in launch_training
    easytorch.launch_training(cfg=cfg, gpus=gpus, node_rank=node_rank)
  File "D:\python-venv\STEP\venv2\lib\site-packages\easytorch\launcher\launcher.py", line 93, in launch_training
    train_dist(cfg)
  File "D:\python-venv\STEP\venv2\lib\site-packages\easytorch\launcher\launcher.py", line 56, in training_func
    raise e
  File "D:\python-venv\STEP\venv2\lib\site-packages\easytorch\launcher\launcher.py", line 52, in training_func
    runner.train(cfg)
  File "D:\python-venv\STEP\venv2\lib\site-packages\easytorch\core\runner.py", line 351, in train
    loss = self.train_iters(epoch, iter_index, data)
  File "D:\python-venv\STEP\basicts\runners\base_tsf_runner.py", line 237, in train_iters
    forward_return = list(self.forward(data=data, epoch=epoch, iter_num=iter_num, train=True))
  File "D:\python-venv\STEP\step\step_runner\step_runner.py", line 66, in forward
    prediction, pred_adj, prior_adj, gsl_coefficient = self.model(history_data=history_data, long_history_data=long_history_data, future_data=None, batch_seen=iter_num, epoch=epoch)
  File "D:\python-venv\STEP\venv2\lib\site-packages\torch\nn\modules\module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "D:\python-venv\STEP\step\step_arch\step.py", line 65, in forward
    y_hat = self.backend(short_term_history, hidden_states=hidden_states, sampled_adj=sampled_adj).transpose(1, 2)
  File "D:\python-venv\STEP\venv2\lib\site-packages\torch\nn\modules\module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "D:\python-venv\STEP\step\step_arch\graphwavenet\model.py", line 197, in forward
    gate = self.gate_convs[i](x)
  File "D:\python-venv\STEP\venv2\lib\site-packages\torch\nn\modules\module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "D:\python-venv\STEP\venv2\lib\site-packages\torch\nn\modules\conv.py", line 313, in forward
    return self._conv_forward(input, self.weight, self.bias)
  File "D:\python-venv\STEP\venv2\lib\site-packages\torch\nn\modules\conv.py", line 309, in _conv_forward
    return F.conv1d(input, weight, bias, self.stride,
RuntimeError: Expected 2D (unbatched) or 3D (batched) input to conv1d, but got input of size: [8, 32, 207, 13]
(venv2) PS D:\python-venv\STEP>
```
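For reference, the RuntimeError above can be reproduced outside STEP in a few lines (a sketch; the tensor shape is taken from the traceback, the channel and kernel values are illustrative):

```python
import torch
import torch.nn as nn

conv = nn.Conv1d(in_channels=32, out_channels=32, kernel_size=2)
x = torch.randn(8, 32, 207, 13)  # 4D input, as in the traceback

try:
    conv(x)  # torch >= 1.10 rejects 4D input to conv1d
except RuntimeError as e:
    print(e)
```

This confirms the failure is in PyTorch's input validation, not in the STEP data pipeline itself.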