
Output

Token: 32 watch

The get_token() method assumes indexing starts at one, while the tokens attribute has the typical zero-based indexing:

flair_sentences[0].tokens[31]

Output

Token: 32 watch
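Since both calls point to the same token, a quick check makes the difference in indexing explicit. This is just a sketch, assuming flair_sentences is the list of Sentence objects built earlier:

token_one_based = flair_sentences[0].get_token(32)  # one-based indexing
token_zero_based = flair_sentences[0].tokens[31]    # zero-based indexing
print(token_one_based.text == token_zero_based.text)  # True, both are "watch"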

To learn more about the Sentence object in flair, please check "Tutorial 1: NLP Base Types." [191]

Then, we can use these Sentence objects to retrieve contextual word embeddings.

But, first, we need to actually load ELMo using ELMoEmbeddings:

from flair.embeddings import ELMoEmbeddings

# downloads the pretrained ELMo model on first use (requires the allennlp package)
elmo = ELMoEmbeddings()
elmo.embed(flair_sentences)

Output

[Sentence: "The Hatter was the first to break the silence . ` What

day of the month is it ? ' he said , turning to Alice : he had taken

his watch out of his pocket , and was looking at it uneasily ,

shaking it every now and then , and holding it to his ear ." [

Tokens: 58],

Sentence: "Alice thought this a very curious thing , and she went

nearer to watch them , and just as she came up to them she heard one

of them say , ` Look out now , Five ! Do n't go splashing paint over

me like that !" [ Tokens: 48]]
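After calling embed(), each token in the sentences carries its contextual vector in the embedding attribute (a PyTorch tensor). The check below is only a sketch; the exact size of the tensor depends on the ELMo model that was downloaded:

# the token "watch" from the first sentence, now with an ELMo vector attached
token = flair_sentences[0].tokens[31]
print(token.text, token.embedding.shape)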
