
The three units in the output layer produce three logits, one for each class (C0, C1, and C2). We could have added an nn.LogSoftmax layer to the model, and it would have converted the three logits to log probabilities.
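To make that distinction concrete, here is a minimal sketch (using made-up logits for a batch of two data points) of what an nn.LogSoftmax layer would do to the model's output:

import torch
import torch.nn as nn

# made-up logits for two data points and three classes (C0, C1, and C2)
dummy_logits = torch.tensor([[1.3, 0.2, -0.8],
                             [-0.5, 2.1, 0.4]])

log_softmax = nn.LogSoftmax(dim=-1)
dummy_log_probs = log_softmax(dummy_logits)
# exponentiating the log probabilities recovers probabilities
# that add up to one for each data point
print(dummy_log_probs.exp().sum(dim=-1))  # tensor([1., 1.])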

Since our model produces logits, we must use the nn.CrossEntropyLoss() function:

Model Configuration — Loss and Optimizer

lr = 0.1
multi_loss_fn = nn.CrossEntropyLoss(reduction='mean')
optimizer_cnn1 = optim.SGD(model_cnn1.parameters(), lr=lr)

And then we create an optimizer (SGD) with a given learning rate (0.1), as usual.

Boring, right? No worries; we'll finally change the optimizer in the Rock Paper Scissors classification problem in the next chapter.
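By the way, if you are wondering how nn.CrossEntropyLoss() relates to the log probabilities discussed above, here is a quick sketch (again using made-up logits and labels) showing that it is equivalent to applying nn.LogSoftmax followed by nn.NLLLoss:

import torch
import torch.nn as nn

# made-up logits for two data points and their true classes
dummy_logits = torch.tensor([[1.3, 0.2, -0.8],
                             [-0.5, 2.1, 0.4]])
dummy_labels = torch.tensor([0, 2])

# cross-entropy takes the logits directly...
loss_ce = nn.CrossEntropyLoss(reduction='mean')(dummy_logits, dummy_labels)
# ...while NLLLoss expects log probabilities
dummy_log_probs = nn.LogSoftmax(dim=-1)(dummy_logits)
loss_nll = nn.NLLLoss(reduction='mean')(dummy_log_probs, dummy_labels)
print(loss_ce, loss_nll)  # both calls produce the same value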

Model Training

This part is completely straightforward. First, we instantiate our StepByStep class and set its loaders:

Model Training

sbs_cnn1 = StepByStep(model_cnn1, multi_loss_fn, optimizer_cnn1)
sbs_cnn1.set_loaders(train_loader, val_loader)

Then, we train it for 20 epochs and visualize the losses:

Model Training

sbs_cnn1.train(20)
fig = sbs_cnn1.plot_losses()

