Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner’s Guide (Leanpub)

Recap

In this chapter, we’ve introduced convolutions and related concepts and built a convolutional neural network to tackle a multiclass classification problem. This is what we’ve covered:

• understanding the role of a kernel / filter in a convolution
• understanding the role of a stride and its impact on the shape of the output
• realizing that there are as many filters as combinations of input and output channels
• using padding to preserve the shape of the output
• using pooling to shrink the shape of the output
• assembling convolution, activation function, and pooling into a typical convolutional block
• using a sequence of convolutional blocks to pre-process images, converting them into features
• (re)building Yann LeCun’s LeNet-5
• generating a dataset of 1,000 images for a multiclass classification problem
• understanding how a softmax function transforms logits into probabilities
• understanding the difference between PyTorch’s negative log-likelihood and cross-entropy losses
• highlighting the importance of choosing the correct combination of last layer and loss function (again)
• using the loss function to handle imbalanced datasets
• building our own convolutional neural network, with a featurizer made of a typical convolutional block, followed by a traditional classifier with a single hidden layer
• visualizing the learned filters
• understanding and using (forward) hooks to capture the outputs of intermediate layers of our model
• removing the hooks after they served their purpose so as to not harm the model speed
• using the captured outputs to visualize feature maps
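The shape arithmetic behind the first few bullets (kernel, stride, padding, and the number of filters) can be checked directly in PyTorch. The sizes below, a 10×10 single-channel input and 3×3 kernels, are illustrative assumptions, not the chapter’s exact values:

```python
import torch
import torch.nn as nn

image = torch.randn(1, 1, 10, 10)  # batch of one single-channel 10x10 image

# Kernel 3x3, stride 1, no padding: the output shrinks to 8x8.
conv = nn.Conv2d(in_channels=1, out_channels=1, kernel_size=3)
print(conv(image).shape)          # torch.Size([1, 1, 8, 8])

# Stride 2 roughly halves the spatial dimensions: 4x4.
conv_strided = nn.Conv2d(1, 1, kernel_size=3, stride=2)
print(conv_strided(image).shape)  # torch.Size([1, 1, 4, 4])

# Padding 1 preserves the 10x10 shape for a 3x3 kernel.
conv_padded = nn.Conv2d(1, 1, kernel_size=3, padding=1)
print(conv_padded(image).shape)   # torch.Size([1, 1, 10, 10])

# Three input and five output channels mean 5 x 3 = 15 distinct 3x3 kernels,
# one per (input, output) channel pair, stored in a single weight tensor.
conv_multi = nn.Conv2d(in_channels=3, out_channels=5, kernel_size=3)
print(conv_multi.weight.shape)    # torch.Size([5, 3, 3, 3])
```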
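The “typical convolutional block” of the recap, convolution, activation function, and pooling, can be sketched as a small `nn.Sequential`. The channel counts and the 28×28 input here are illustrative assumptions:

```python
import torch
import torch.nn as nn

# Convolution -> activation -> pooling: padding keeps 28x28 through the
# convolution, and 2x2 max pooling then halves the spatial size.
block = nn.Sequential(
    nn.Conv2d(in_channels=1, out_channels=16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=2),
)

dummy = torch.randn(1, 1, 28, 28)
print(block(dummy).shape)  # torch.Size([1, 16, 14, 14])
```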
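The LeNet-5 rebuild can be sketched by chaining such blocks and a classifier. This follows the classic architecture for a 28×28 input with `padding=2` standing in for the original 32×32 input; the chapter’s version may differ in details such as activations:

```python
import torch
import torch.nn as nn

lenet = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5, padding=2),  # C1: 6 @ 28x28
    nn.Tanh(),
    nn.AvgPool2d(kernel_size=2),                # S2: 6 @ 14x14
    nn.Conv2d(6, 16, kernel_size=5),            # C3: 16 @ 10x10
    nn.Tanh(),
    nn.AvgPool2d(kernel_size=2),                # S4: 16 @ 5x5
    nn.Flatten(),
    nn.Linear(16 * 5 * 5, 120),                 # C5
    nn.Tanh(),
    nn.Linear(120, 84),                         # F6
    nn.Tanh(),
    nn.Linear(84, 10),                          # one logit per class
)
print(lenet(torch.randn(1, 1, 28, 28)).shape)   # torch.Size([1, 10])
```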
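The softmax bullet can be verified in a couple of lines; the logits below are made up for illustration. Softmax exponentiates each logit and normalizes, so the outputs are positive and sum to one:

```python
import torch
import torch.nn.functional as F

logits = torch.tensor([1.0, 2.0, 3.0])  # made-up logits for three classes
probs = F.softmax(logits, dim=-1)
print(probs)        # ~tensor([0.0900, 0.2447, 0.6652])
print(probs.sum())  # tensor(1.)
```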
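The difference between the two losses comes down to what they expect as input: `nn.CrossEntropyLoss` takes raw logits and applies log-softmax internally, while `nn.NLLLoss` expects log-probabilities, so the model’s last layer must match the loss. A quick check with made-up logits:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.tensor([[1.0, 2.0, 3.0]])  # made-up logits for one sample
labels = torch.tensor([2])

# CrossEntropyLoss takes raw logits: it applies log-softmax internally.
ce = nn.CrossEntropyLoss()(logits, labels)

# NLLLoss expects log-probabilities, so we apply log_softmax ourselves.
nll = nn.NLLLoss()(F.log_softmax(logits, dim=-1), labels)

print(torch.isclose(ce, nll))  # tensor(True)
```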
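Handling imbalanced datasets through the loss function can be sketched with the loss’s `weight` argument. The class counts below are hypothetical, and inverse-frequency weighting is one common recipe (not necessarily the chapter’s exact one):

```python
import torch
import torch.nn as nn

# Hypothetical class counts for an imbalanced three-class dataset.
counts = torch.tensor([900.0, 90.0, 10.0])

# Weight each class by the inverse of its frequency, so mistakes on the
# rare class cost proportionally more.
weights = 1.0 / counts
loss_fn = nn.CrossEntropyLoss(weight=weights)

logits = torch.randn(8, 3)          # a made-up batch of eight samples
labels = torch.randint(0, 3, (8,))
print(loss_fn(logits, labels))      # a scalar loss tensor
```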
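The last few bullets, capturing intermediate outputs with a forward hook and removing the hook afterwards, can be sketched like this; the tiny model and 8×8 input are assumptions for illustration:

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 4, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
)

captured = {}

def hook(module, inputs, output):
    # Detach so the captured tensor does not keep the autograd graph alive.
    captured['conv'] = output.detach()

# Register on the convolutional layer, run a forward pass, then remove the
# hook so it no longer adds overhead to subsequent passes.
handle = model[0].register_forward_hook(hook)
model(torch.randn(1, 1, 8, 8))
handle.remove()

print(captured['conv'].shape)  # torch.Size([1, 4, 8, 8])
```

The captured tensor is what one would plot channel by channel to visualize the feature maps.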

414 | Chapter 5: Convolutions
