
Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub


Figure 11.19 - Losses—bag-of-embeddings (BoE)

StepByStep.loader_apply(test_loader, sbs_emb.correct)

Output

tensor([[380, 440],
        [311, 331]])

That’s 89.62% accuracy on the test set. Not bad, not bad at all!
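The reported figure follows directly from the tensor above: each row holds the number of correct predictions and the number of data points for one class, so the overall accuracy is the sum of the first column divided by the sum of the second. A quick sketch of that computation:

```python
import torch

# Output of loader_apply: one row per class, columns are
# (correct predictions, number of data points)
correct = torch.tensor([[380, 440],
                        [311, 331]])

# Overall accuracy: total correct over total points
accuracy = correct[:, 0].sum().item() / correct[:, 1].sum().item()
print(f'{accuracy:.2%}')  # 89.62%
```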

"OK, but I don’t want to use a Sequential model, I want to use a

Transformer!"

I hear you.

Model II — GloVe + Transformer

We’ll use a Transformer encoder as a classifier again, just like we did in the "Vision Transformer" section of Chapter 10. The model is pretty much the same, except that we’re using pre-trained word embeddings instead of patch embeddings:

Model Configuration

class TransfClassifier(nn.Module):
    def __init__(self, embedding_layer, encoder, n_outputs):
        super().__init__()
        self.d_model = encoder.d_model
        self.n_outputs = n_outputs

942 | Chapter 11: Down the Yellow Brick Rabbit Hole
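The listing above is cut off at the page break, so the rest of the class is not shown here. As a rough, self-contained sketch of the overall idea only, not the book's actual code, the following uses PyTorch's built-in nn.TransformerEncoder in place of the book's custom encoder, prepends a learnable classifier token, and puts a hypothetical linear head on that token's output:

```python
import torch
import torch.nn as nn

class TransfClassifierSketch(nn.Module):
    """Illustrative sketch: word embeddings fed to a Transformer
    encoder, with a linear head on a prepended classifier token."""
    def __init__(self, embedding_layer, d_model, n_outputs):
        super().__init__()
        self.embed = embedding_layer  # e.g. frozen GloVe embeddings
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=2, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=1)
        # Learnable classifier token, broadcast over the batch
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))
        self.head = nn.Linear(d_model, n_outputs)

    def forward(self, x):                        # x: (N, L) token ids
        emb = self.embed(x)                      # (N, L, d_model)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        out = self.encoder(torch.cat([cls, emb], dim=1))
        return self.head(out[:, 0])              # logits from the token

emb = nn.Embedding(100, 8)                       # stand-in for GloVe
model = TransfClassifierSketch(emb, d_model=8, n_outputs=1)
logits = model(torch.randint(0, 100, (4, 10)))
print(logits.shape)
```

The names `TransfClassifierSketch`, `cls_token`, and `head` are assumptions for illustration; the book's own model continues on the next page with its custom encoder.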
