Error in Chapter_6.ipynb #4
Can you post which version of PyTorch you get with your Colab session and this error? And does this error also occur for you in Chapter 4? The last few PyTorch updates have, annoyingly, included breaking changes that don't appear in the changelog, so I may have missed a new one since writing this chapter.
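For reference, the installed version shows up directly in a Colab cell via the standard `torch.__version__` attribute:

```python
import torch

print(torch.__version__)  # e.g. '1.7.0+cu101' -- the exact string here is illustrative
```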
Yes, same error in Chapter 4 in:
And on my Colab, I am using PyTorch
Ok, I think I know what is happening here. If you switch to PyTorch 1.6.x it should go away. I'll get this fixed once I get to revisions of Chapter 4 or 6 with the editors. If you go to the code:

```python
import torch
from torch import nn

class EmbeddingPackable(nn.Module):
    """
    The embedding layer in PyTorch does not support PackedSequence objects.
    This wrapper class will fix that. If a normal input comes in, it will
    use the regular Embedding layer. Otherwise, it will work on the packed
    sequence to return a new PackedSequence of the appropriate result.
    """
    def __init__(self, embd_layer):
        super(EmbeddingPackable, self).__init__()
        self.embd_layer = embd_layer

    def forward(self, input):
        if type(input) == torch.nn.utils.rnn.PackedSequence:
            # We need to unpack the input,
            sequences, lengths = torch.nn.utils.rnn.pad_packed_sequence(input.cpu(), batch_first=True)
            # embed it,
            sequences = self.embd_layer(sequences.to(input.data.device))
            # and pack it into a new sequence
            return torch.nn.utils.rnn.pack_padded_sequence(sequences, lengths.to(input.data.device),
                                                           batch_first=True, enforce_sorted=False)
        else:  # apply to normal data
            return self.embd_layer(input)
```

and change
When running in Colab (using a GPU) I got the following error in the cell:

The error is:

```
/usr/local/lib/python3.6/dist-packages/torch/nn/utils/rnn.py in pack_padded_sequence(input, lengths, batch_first, enforce_sorted)
```
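The contents of the failing cell aren't shown above, but assuming the failure mode is the PyTorch >= 1.7 lengths-on-CPU change described earlier, a minimal snippet that reproduces this kind of traceback on a GPU runtime looks like:

```python
import torch

if torch.cuda.is_available():
    seqs = torch.randn(2, 5, 4, device="cuda")
    # On PyTorch >= 1.7 a CUDA `lengths` tensor raises a RuntimeError
    # from pack_padded_sequence; on 1.6.x and earlier it was accepted.
    lengths = torch.tensor([5, 3], device="cuda")
    torch.nn.utils.rnn.pack_padded_sequence(
        seqs, lengths, batch_first=True, enforce_sorted=False)
```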