
Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner's Guide (Leanpub)


Helper Function to Retrieve Embeddings

import torch
from flair.data import Sentence

def get_embeddings(embeddings, sentence):
    # Wrap the raw string in a flair Sentence and embed it in place
    sent = Sentence(sentence)
    embeddings.embed(sent)
    # Stack the per-token embeddings into a single (n_tokens, dim) tensor
    return torch.stack(
        [token.embedding for token in sent.tokens]
    ).float()

get_embeddings(elmo, watch1)

Output

tensor([[-0.3288,  0.2022, -0.5940,  ...,  1.0606,  0.2637],
        [-0.7142,  0.4210, -0.9504,  ..., -0.6684,  1.7245],
        [ 0.2981, -0.0738, -0.1319,  ...,  1.1165,  0.6453],
        ...,
        [ 0.0475,  0.2325, -0.2013,  ..., -0.5294, -0.8543],
        [ 0.1599,  0.6898,  0.2946,  ...,  0.9584,  1.0337],
        [-0.8872, -0.2004, -1.0601,  ..., -0.0841,  0.0618]],
       device='cuda:0')

The returned tensor has 58 embeddings of 3,072 dimensions each.
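The stacking logic can be illustrated without downloading the actual ELMo weights. The sketch below uses hypothetical stand-in classes (FakeToken and FakeSentence are my inventions, mimicking the `token.embedding` and `sent.tokens` attributes that flair exposes) to show how per-token vectors become a single (n_tokens, dim) tensor; in the book, `elmo` is a flair ELMo embeddings object.

```python
import torch

# Hypothetical stand-ins for flair's token/sentence objects, just to
# illustrate the stacking step of get_embeddings() without loading ELMo.
class FakeToken:
    def __init__(self, dim):
        # Each token carries one embedding vector of size `dim`
        self.embedding = torch.randn(dim)

class FakeSentence:
    def __init__(self, n_tokens, dim):
        self.tokens = [FakeToken(dim) for _ in range(n_tokens)]

# Mimic the 58-token sentence with 3,072-dimensional embeddings
sent = FakeSentence(n_tokens=58, dim=3072)

# Same stacking as in get_embeddings(): one row per token
stacked = torch.stack([t.embedding for t in sent.tokens]).float()
print(stacked.shape)  # torch.Size([58, 3072])
```

The 3,072 dimensions come from concatenating ELMo's internal representations per token; the stacking itself is plain `torch.stack` over a list of equally-sized vectors.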

For more details on ELMo embeddings, please check "ELMo Embeddings"[192] and "Tutorial 4: List of All Word Embeddings."[193]

Contextual Word Embeddings | 953
