Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner's Guide (Leanpub)


Now we're talking! The last hidden state, (-0.5297, 0.3551), is the representation of the full sequence.

Figure 8.9 depicts what the loop above looks like at the neuron level. In it, you can easily see what I call "the journey of a hidden state": it is transformed, translated (adding the input), and activated many times over. Moreover, you can also see that the data points are independently transformed; the model will learn the best way to transform them. We'll get back to this after training a model.

At this point, you may be thinking:

"Looping over the data points in a sequence?! That looks like a lot of work!"

And you're absolutely right! Instead of an RNN cell, we can use a full-fledged…

RNN Layer

The nn.RNN layer takes care of the hidden state handling for us, no matter how long the input sequence is. This is the layer we'll actually be using in the model. We've been through the inner workings of its cells, but the full-fledged RNN offers many more options (stacked and / or bidirectional layers, for instance) and one tricky thing regarding the shapes of inputs and outputs (yes, shapes are kind of a recurrent problem; pun intended!).
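The loop described above can be sketched as follows. This is a minimal, self-contained version, assuming a toy sequence of four two-dimensional data points and a hidden state of size two; the random seed and input values are made up for illustration, so the resulting hidden state will not match the book's (-0.5297, 0.3551).

```python
import torch
import torch.nn as nn

torch.manual_seed(21)  # hypothetical seed, just for reproducibility

# A toy sequence of four 2-D data points: (sequence length, n_features)
X = torch.rand(4, 2)

n_features, hidden_dim = 2, 2
rnn_cell = nn.RNNCell(input_size=n_features, hidden_size=hidden_dim)

# The cell handles one data point at a time, so WE must carry the
# hidden state across the steps of the loop
hidden = torch.zeros(1, hidden_dim)  # initial hidden state
for i in range(X.shape[0]):
    # each step: transform the hidden state, translate (add the
    # transformed input), and activate (tanh)
    hidden = rnn_cell(X[i:i+1], hidden)

print(hidden)  # the last hidden state represents the full sequence
```

Since the cell's activation is a hyperbolic tangent, every component of the hidden state is guaranteed to lie between -1 and 1.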


Figure 8.9 - Multiple cells in sequence
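To see what the full-fledged layer buys us, here is a minimal sketch of nn.RNN replacing the manual loop; the batch of three sequences, the batch_first argument, and all sizes are assumptions for illustration. The layer loops over the sequence internally and returns both the hidden states for every step and the final hidden state.

```python
import torch
import torch.nn as nn

torch.manual_seed(21)  # hypothetical seed

# A made-up batch of three sequences, each with four 2-D data points;
# with batch_first=True the expected shape is (batch, seq_len, n_features)
batch = torch.rand(3, 4, 2)

rnn = nn.RNN(input_size=2, hidden_size=2, batch_first=True)

# out:   hidden states for EVERY step -> (batch, seq_len, hidden_size)
# final: last hidden state only       -> (num_layers, batch, hidden_size)
out, final = rnn(batch)

print(out.shape)    # torch.Size([3, 4, 2])
print(final.shape)  # torch.Size([1, 3, 2])

# For a single, unidirectional layer, the last step of "out" is the
# same tensor as the final hidden state (up to a permutation of axes)
assert torch.allclose(out[:, -1], final[0])
```

Notice the shape quirk mentioned above: the final hidden state keeps the layer dimension first, not the batch dimension, even when batch_first=True applies to the input and output sequences.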
