Hello! I've been trying to use the layered, bidirectional RIM, and it throws an error while initializing the layers for training. It appears to fail on the size of the newly created hidden state.
Here's the traceback:

```
  File "/home/mila/m/mashbayar.tugsbayar/Recurrent-Independent-Mechanisms/main_bidirectional.py", line 159, in <module>
    train_model(model, args['epochs'], data)
  File "/home/mila/m/mashbayar.tugsbayar/Recurrent-Independent-Mechanisms/main_bidirectional.py", line 114, in train_model
    output, l = model(inp_x, inp_y)
  File "/home/mila/m/mashbayar.tugsbayar/.conda/envs/brim/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/mila/m/mashbayar.tugsbayar/Recurrent-Independent-Mechanisms/RIM.py", line 336, in forward
    x_fw, hs[idx] = self.layer(self.rimcell[idx], x, hs[idx], c = None)
  File "/home/mila/m/mashbayar.tugsbayar/Recurrent-Independent-Mechanisms/RIM.py", line 297, in layer
    hs = h.squeeze(0).view(batch_size, self.num_units, -1)
```
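For what it's worth, the failing line can only succeed when the hidden state's total element count is divisible by `batch_size * num_units`, since `view(batch_size, self.num_units, -1)` must infer the last dimension exactly. A minimal sketch of that constraint (the shape numbers below are hypothetical, not taken from the repo):

```python
def can_view(numel: int, batch_size: int, num_units: int) -> bool:
    """Return True iff h.squeeze(0).view(batch_size, num_units, -1)
    could succeed for a tensor with `numel` total elements: the -1
    dimension is inferred, so numel must divide evenly."""
    return numel % (batch_size * num_units) == 0

# Hypothetical example: a hidden state with 6 * 600 = 3600 elements.
print(can_view(3600, 6, 4))  # 3600 % 24 == 0 -> True
print(can_view(3600, 6, 7))  # 3600 % 42 != 0 -> False (RuntimeError in torch)
```

So if the per-layer hidden size isn't a multiple of `num_units`, this reshape will raise the shape error at `RIM.py` line 297.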
Thank you!