We feed the original sequence (x) to the forward RNN and its reversed copy (x_rev) to the reverse RNN, and then flip the reverse RNN's sequence of hidden states back to the natural, left-to-right order:

out, h = rnn_forward(x)
out_rev, h_rev = rnn_reverse(x_rev)
out_rev_back = torch.flip(out_rev, dims=[1])  # undo the reversal along the sequence (L) dimension
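
In case you're running this on its own, here is a minimal sketch of the setup the snippet above assumes: a bidirectional RNN, two single-direction RNNs (rnn_forward and rnn_reverse) that borrow its weights, and a reversed copy of the input sequence. The sizes (two features, hidden state of size two) match the output shown further down, but the seed and the data here are placeholders, so this sketch won't reproduce the exact numbers:

import torch
import torch.nn as nn

torch.manual_seed(19)  # arbitrary seed, just for reproducibility

n_features, hidden_dim = 2, 2  # assumed sizes; the output below implies hidden_size=2

# the "real" bidirectional RNN we want to replicate
rnn_bidirect = nn.RNN(input_size=n_features, hidden_size=hidden_dim,
                      bidirectional=True, batch_first=True)

# two single-direction RNNs that will borrow the bidirectional RNN's weights
rnn_forward = nn.RNN(input_size=n_features, hidden_size=hidden_dim,
                     batch_first=True)
rnn_reverse = nn.RNN(input_size=n_features, hidden_size=hidden_dim,
                     batch_first=True)

# forward-direction parameters have no suffix; reverse-direction ones
# end in "_reverse"
state = rnn_bidirect.state_dict()
rnn_forward.load_state_dict({k: v for k, v in state.items()
                             if not k.endswith('_reverse')})
rnn_reverse.load_state_dict({k.replace('_reverse', ''): v
                             for k, v in state.items()
                             if k.endswith('_reverse')})

# a dummy sequence with shape (N=1, L=4, F=2) and its reversed copy
x = torch.randn(1, 4, n_features)
x_rev = torch.flip(x, dims=[1])  # flips the sequence (L) dimension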

The overall output of the bidirectional RNN must have two elements as well:

• the side-by-side concatenation of both sequences of hidden states (out and out_rev_back);

• the concatenation of the final hidden states of both layers (h and h_rev).

# dim=2 is the hidden-state dimension; the final hidden states are
# stacked along dim=0 (torch.cat's default)
torch.cat([out, out_rev_back], dim=2), torch.cat([h, h_rev])

Output

(tensor([[[ 0.3924,  0.8146, -0.9355, -0.8353],
          [ 0.4347, -0.0481, -0.1766,  0.2596],
          [-0.1521, -0.3367,  0.8829,  0.0425],
          [-0.5297,  0.3551, -0.2032, -0.7901]]],
        grad_fn=<CatBackward>),
 tensor([[[-0.5297,  0.3551]],
         [[-0.9355, -0.8353]]], grad_fn=<CatBackward>))
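
These shapes line up with what nn.RNN itself returns in bidirectional mode: the concatenated sequence of hidden states has shape (N, L, 2*H), while the stacked final hidden states have shape (num_directions, N, H). A quick sanity check, assuming the snippets above have run:

print(torch.cat([out, out_rev_back], dim=2).shape)  # torch.Size([1, 4, 4]) -> (N, L, 2*H)
print(torch.cat([h, h_rev]).shape)                  # torch.Size([2, 1, 2]) -> (directions, N, H)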

Done! We’ve replicated the inner workings of a bidirectional RNN using two simple RNNs. You can double-check the results by feeding the sequence of data points to the actual bidirectional RNN:

out, hidden = rnn_bidirect(x)

And, once again, you’ll get the very same results.
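
If you'd rather not compare the numbers by eye, a minimal sketch of an explicit check could look like this (the manual concatenations are stored first, since the call above reuses the name out; torch.allclose allows for floating-point tolerance):

# store the manual replication before calling the bidirectional RNN
manual_out = torch.cat([out, out_rev_back], dim=2)
manual_hidden = torch.cat([h, h_rev])

# PyTorch's actual bidirectional RNN
out, hidden = rnn_bidirect(x)

# element-wise comparison with floating-point tolerance
print(torch.allclose(manual_out, out))        # True
print(torch.allclose(manual_hidden, hidden))  # True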
