
Output

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-154-9b17f363443c> in <module>
----> 1 torch.as_tensor([x0, x1, x2])

ValueError: expected sequence of length 4 at dim 1 (got 2)
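The error pops up because the sequences do not all have the same number of points, so they cannot be stacked into a single tensor. Below is a minimal sketch that reproduces it, assuming hypothetical values for x0, x1, and x2 (the actual sequences were built earlier in the chapter):

import torch

# hypothetical values, just to reproduce the error above:
# x0 has four points, x1 only two, x2 only three
x0 = [[ 1.0,  1.0], [ 1.0, -1.0], [-1.0, -1.0], [-1.0,  1.0]]
x1 = [[-1.0,  1.0], [-1.0, -1.0]]
x2 = [[-1.0, -1.0], [ 1.0, -1.0], [ 1.0,  1.0]]

# stacking sequences of different lengths raises the ValueError
torch.as_tensor([x0, x1, x2])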

We can use PyTorch’s nn.utils.rnn.pad_sequence() to perform the padding for us. It takes as arguments a list of sequences, a padding value (the default is zero), and the option to make the result batch-first. Let’s give it a try:

seq_tensors = [torch.as_tensor(seq).float() for seq in all_seqs]
padded = rnn_utils.pad_sequence(seq_tensors, batch_first=True)
padded

Output

tensor([[[ 1.0349,  0.9661],
         [ 0.8055, -0.9169],
         [-0.8251, -0.9499],
         [-0.8670,  0.9342]],

        [[-1.0911,  0.9254],
         [-1.0771, -1.0414],
         [ 0.0000,  0.0000],
         [ 0.0000,  0.0000]],

        [[-1.1247, -0.9683],
         [ 0.8182, -0.9944],
         [ 1.0081,  0.7680],
         [ 0.0000,  0.0000]]])

Both the second and the third sequences were shorter than the first, so they got padded accordingly to match the length of the longest sequence.
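Since pad_sequence() also accepts a padding value, we are not limited to zeros. A minimal sketch, assuming we wanted to pad with -1.0 instead (the variable name is ours, not the book’s):

padded_minus_one = rnn_utils.pad_sequence(
    seq_tensors, batch_first=True, padding_value=-1.0
)
# the missing points of the second and third sequences are now
# filled with -1.0 instead of 0.0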

