Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner's Guide (Leanpub)

target_seq = X[:, self.input_len:, :]
self.init_outputs(X.shape[0])

# Encoder expected N, L, F
hidden_seq = self.encoder(source_seq)
# Output is N, L, H
self.decoder.init_hidden(hidden_seq)

# The last input of the encoder is also
# the first input of the decoder
dec_inputs = source_seq[:, -1:, :]

# Generates as many outputs as the target length
for i in range(self.target_len):
    # Output of decoder is N, 1, F
    out = self.decoder(dec_inputs)
    self.store_output(i, out)

    prob = self.teacher_forcing_prob
    # In evaluation / test, the target sequence is
    # unknown, so we cannot use teacher forcing
    if not self.training:
        prob = 0

    # If it is teacher forcing
    if torch.rand(1) <= prob:
        # Takes the actual element
        dec_inputs = target_seq[:, i:i+1, :]
    else:
        # Otherwise uses the last predicted output
        dec_inputs = out

return self.outputs

The only real additions are the init_outputs() method, which creates a tensor for storing the generated target sequence, and the store_output() method, which actually stores the output produced by the decoder.
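These two helper methods are not shown in the listing above. A minimal sketch of what they might look like follows; the constructor arguments and the n_features attribute on the encoder are assumptions made here for illustration, not necessarily the book's exact implementation:

```python
import torch
import torch.nn as nn

class EncoderDecoder(nn.Module):
    def __init__(self, encoder, decoder, input_len, target_len,
                 teacher_forcing_prob=0.5):
        super().__init__()
        self.encoder = encoder
        self.decoder = decoder
        self.input_len = input_len
        self.target_len = target_len
        self.teacher_forcing_prob = teacher_forcing_prob
        self.outputs = None

    def init_outputs(self, batch_size):
        device = next(self.parameters()).device
        # Preallocates an (N, target_len, F) tensor to collect
        # the decoder's predictions, one step at a time
        self.outputs = torch.zeros(batch_size,
                                   self.target_len,
                                   self.encoder.n_features,
                                   device=device)

    def store_output(self, i, out):
        # Writes the decoder's i-th step (shape N, 1, F)
        # into the corresponding slice of the buffer
        self.outputs[:, i:i+1, :] = out
```

Preallocating the buffer once and filling it in place avoids growing a Python list of tensors and concatenating them at the end of the loop.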

700 | Chapter 9 — Part I: Sequence-to-Sequence
