
First, let’s compute the logits and corresponding probabilities:

Evaluation

logits_val = sbs.predict(X_val)
probabilities_val = sigmoid(logits_val).squeeze()
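
If you're running this snippet on its own, here is a minimal sketch of a sigmoid helper; the book defines an equivalent function earlier in the chapter, so the exact implementation (NumPy here) is an assumption:

import numpy as np

def sigmoid(z):
    # Maps logits (any real number) to probabilities in the (0, 1) range
    return 1 / (1 + np.exp(-z))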

Then, let’s visualize the probabilities on a line; that is, we’re going from the fancy

contour plot to a simpler one:

Figure 3.9 - Probabilities on a line

The left plot comes from Figure 3.8. It shows the contour plot of the probabilities

and the decision boundary as a straight gray line. We place the data points on a

line, according to their predicted probabilities. That’s the plot on the right.

The decision boundary is shown as a vertical dashed line placed at the chosen

threshold (0.5). Points to the left of the dashed line are classified as red, and

therefore have red edges around them, while those to the right are classified as

blue, and have blue edges around them.
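
In code, this classification rule is just a comparison against the threshold. A minimal sketch, assuming probabilities_val is a NumPy array and the usual encoding from this chapter (blue = positive class = 1, red = negative class = 0):

# Probabilities at or above the 0.5 threshold are classified as blue (1),
# the rest as red (0)
predictions_val = (probabilities_val >= 0.5).astype(int)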

The points are filled with their actual color, meaning that those with distinct

colors for edge and filling are misclassified. In the figure above, we have one blue

point classified as red (left) and two red points classified as blue (right).
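
We can also count those misclassified points directly by comparing predicted classes with the actual labels. A minimal sketch, assuming the validation labels are available as a y_val array shaped like the predictions (both y_val and predictions_val, from the sketch above, are assumed names):

# A point is misclassified whenever its predicted class differs from its label
misclassified = (predictions_val != y_val.squeeze())
print(misclassified.sum())  # three points in Figure 3.9: one blue, two red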

Now, let’s make a tiny change to our plot to make it more visually interesting: We’ll

plot blue (positive) points below the probability line and red (negative) points

above the probability line.
