
Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub


Figure 4.14 - Leaky ReLU function and its gradient

As you can see in the figure above, the Leaky ReLU is pretty much the same as the ReLU, except for the tiny, barely visible slope on the left-hand side.
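Since the figure also shows the gradient, we can verify it with autograd: a minimal sketch (the tensor values mirror the `dummy_z` used below) that confirms the slope is `negative_slope` on the left of zero and one on the right.

```python
import torch
import torch.nn.functional as F

# requires_grad=True so we can ask autograd for the gradients
dummy_z = torch.tensor([-3., 0., 3.], requires_grad=True)

# sum() gives a scalar, so backward() fills dummy_z.grad with
# the derivative of Leaky ReLU at each element
F.leaky_relu(dummy_z, negative_slope=0.01).sum().backward()

dummy_z.grad  # 0.01 for the negative input, 1.0 for the positive one
```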

Once again, we have two options. Functional (F.leaky_relu()):

dummy_z = torch.tensor([-3., 0., 3.])
F.leaky_relu(dummy_z, negative_slope=0.01)

Output

tensor([-0.0300, 0.0000, 3.0000])

And module (nn.LeakyReLU):

nn.LeakyReLU(negative_slope=0.02)(dummy_z)

Output

tensor([-0.0600, 0.0000, 3.0000])
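Both calls implement the same simple rule: keep positive inputs as they are and multiply negative ones by the slope. A quick sketch with `torch.where` reproduces the functional version's output, confirming there is no hidden magic.

```python
import torch
import torch.nn.functional as F

dummy_z = torch.tensor([-3., 0., 3.])

# Leaky ReLU by hand: z if z > 0, otherwise negative_slope * z
manual = torch.where(dummy_z > 0, dummy_z, 0.01 * dummy_z)

torch.allclose(manual, F.leaky_relu(dummy_z, negative_slope=0.01))  # True
```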

318 | Chapter 4: Classifying Images
