The solution found by the model was to rotate it to the right and scale it up (the linear transformation), and then translate it to the right and up (making it an affine transformation). That was achieved by "Hidden Layer #0." Then the sigmoid activation function turned that transformed feature space into the oddly shaped figures in the right column. In the final plot, the resulting activated feature space looks like it was "zoomed in" on its center, as if we were looking at it through a magnifying glass. Notice the ranges in the right-side plots: they are restricted to the (0, 1) interval. That's the range of the sigmoid. What if we try a different activation function?
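
To make the affine-plus-activation idea concrete, here is a minimal PyTorch sketch. It is not the book's actual model; the rotation angle, scale factor, and bias values below are made up purely for illustration.

import math
import torch
import torch.nn as nn

# Hidden layer: two inputs, two units. Its weight matrix is the linear
# transformation and its bias is the translation, so the layer as a
# whole applies an affine transformation to the 2-D feature space.
hidden = nn.Linear(2, 2)

with torch.no_grad():
    c, s = math.cos(-0.5), math.sin(-0.5)        # illustrative angle
    rotation = torch.tensor([[c, -s], [s, c]])
    hidden.weight.copy_(1.5 * rotation)          # rotate right, scale up
    hidden.bias.copy_(torch.tensor([1.0, 1.0]))  # translate right and up

points = torch.randn(5, 2)      # a few points in the feature space
z = hidden(points)              # transformed (affine) feature space
activated = torch.sigmoid(z)    # activated feature space
print(activated.min().item(), activated.max().item())  # within (0, 1)

No matter how far the affine transformation pushes the points, the sigmoid squashes every coordinate back into the (0, 1) interval.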

More Functions, More Boundaries

Let’s try the hyperbolic tangent.

Figure B.7 - Activated feature space—TanH

It is actually quite similar—especially the transformed feature space of the hidden layer. The range of the activated feature space is different, though: it is restricted to the (-1, 1) interval, corresponding to the range of the hyperbolic tangent.
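
If you want to verify those ranges numerically, a quick comparison will do. This is a simple sketch, not code taken from the book:

import torch

z = torch.linspace(-5, 5, steps=9)  # values far outside both ranges
print(torch.tanh(z))     # squashed into (-1, 1)
print(torch.sigmoid(z))  # squashed into (0, 1)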

What about the famous ReLU?

Figure B.8 - Activated feature space—ReLU
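
Unlike the sigmoid and the hyperbolic tangent, the ReLU does not squash its inputs into a bounded interval; it simply clips negative values to zero. A tiny illustration (again, not from the book):

import torch

z = torch.tensor([[-2.0, 0.5],
                  [1.0, -0.3],
                  [3.0, 2.0]])
print(torch.relu(z))
# Negative coordinates become zero while positive ones pass through
# unchanged, so the activated feature space lives in [0, +inf)
# instead of a bounded interval.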
