
7. The unit in the output layer takes the outputs of the preceding hidden layer to compute its corresponding output (z₂).

8. z₂ is a logit, which is converted to a probability using a sigmoid function.
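
In PyTorch, that last step is just a call to torch.sigmoid; here is a minimal sketch (the logit value below is made up for illustration):

    import torch

    # z2 is the logit produced by the output layer (value made up)
    z2 = torch.tensor([0.5])

    # The sigmoid function squashes the logit into a probability in (0, 1)
    prob = torch.sigmoid(z2)
    print(prob)  # tensor([0.6225])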

There are a couple of things to highlight:

• All units in the hidden layers, and the one in the output layer, take a set of inputs (x or z) and perform the same operation (wᵀx or wᵀz, each using its own weights, of course), producing an output (z).

• In the hidden layers, these operations are exactly like the logistic regression models we have used so far, up to the point where the logistic regression produced a logit.

• It is perfectly fine to think of the outputs of one layer as features of the next layer; actually, this is at the heart of the transfer learning technique we’ll see in Chapter 7.

• For a binary classification problem, the output layer is a logistic regression, where the "features" are the outputs produced by the previous hidden layer (see the sketch after this list).

Not so complicated, right? It actually seems like a natural extension of the logistic regression. Let’s see how it performs in practice.

