Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner's Guide (Leanpub)

If that’s too large for your GPU, though, don’t worry: there are many different versions of BERT for all tastes and budgets, and you can find them in Google Research’s BERT repository.[202]

You can also check BERT’s documentation[203] and model card,[204] available at HuggingFace, for a quick overview of the model and its training procedure.

For a general overview of BERT, please check Jay Alammar’s excellent posts on the topic: "The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)"[205] and "A Visual Guide to Using BERT for the First Time."[206]

AutoModel

If you want to quickly try different models without having to import their corresponding classes, you can use HuggingFace’s AutoModel instead:

from transformers import AutoModel

auto_model = AutoModel.from_pretrained('bert-base-uncased')
print(auto_model.__class__)

Output

<class 'transformers.modeling_bert.BertModel'>

As you can see, it infers the correct model class based on the name of the model you’re loading, e.g., bert-base-uncased.
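Conceptually, AutoModel looks at the checkpoint’s configuration to decide which model class to instantiate. The sketch below mimics that idea with a hypothetical name-based lookup — the mapping and the infer_class function are ours, not the library’s actual code (in practice, transformers reads the model_type field from the checkpoint’s config.json):

```python
# Hypothetical sketch of AutoModel's dispatch, NOT transformers' real code
MODEL_CLASSES = {
    'bert': 'BertModel',
    'distilbert': 'DistilBertModel',
    'gpt2': 'GPT2Model',
}

def infer_class(model_name: str) -> str:
    # Try longer model-type names first, so 'distilbert-...' is not
    # mistaken for plain 'bert'
    for model_type, cls_name in sorted(MODEL_CLASSES.items(),
                                       key=lambda kv: -len(kv[0])):
        if model_type in model_name:
            return cls_name
    raise ValueError(f'Unknown model type in {model_name!r}')

print(infer_class('bert-base-uncased'))        # BertModel
print(infer_class('distilbert-base-uncased'))  # DistilBertModel
```

The real library resolves the class from configuration metadata rather than string matching, but the effect is the same: you get the right architecture without importing its class yourself.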

Let’s create our first BERT model by loading the pre-trained weights for bert-base-uncased:

from transformers import BertModel

bert_model = BertModel.from_pretrained('bert-base-uncased')
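To see whether a given model fits your GPU budget, you can count its parameters with plain PyTorch. The count_parameters helper below is our own, and the small linear layer is a stand-in so the snippet runs without downloading any weights; applied to bert_model above, it would report roughly 110 million parameters for bert-base-uncased:

```python
import torch.nn as nn

def count_parameters(model: nn.Module) -> int:
    # Sum the number of elements across all parameter tensors
    return sum(p.numel() for p in model.parameters())

# Stand-in module (pass bert_model here instead to size up BERT)
toy = nn.Linear(768, 768)
print(count_parameters(toy))  # 768 * 768 weights + 768 biases = 590592
```

Multiplying the parameter count by the bytes per element (four for float32) gives a rough lower bound on the memory the weights alone will occupy.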

966 | Chapter 11: Down the Yellow Brick Rabbit Hole
