
Figure 3.13 - Using a low threshold

You can see in the figure above that lowering the threshold (moving it to the left on the probability line) turned one false negative into a true positive (blue point close to 0.4), but it also turned one true negative into a false positive (red point close to 0.4).

Let's double-check it with Scikit-Learn's confusion matrix:

confusion_matrix(y_val, (probabilities_val >= 0.3))

Output

array([[ 6,  3],
       [ 0, 11]])

OK, now let's plot the corresponding metrics one more time:

Figure 3.14 - Trade-offs for two different thresholds

Still not a curve, I know, but we can already learn something from these two points.
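Before moving on, it may help to compute those two plotted points explicitly. The sketch below is a minimal illustration, assuming the y_val and probabilities_val arrays from the chapter's validation set; the helper name, metrics_at, is ours, not the book's:

import numpy as np
from sklearn.metrics import confusion_matrix

def metrics_at(y_true, probs, threshold):
    # Scikit-Learn lays out the confusion matrix as [[TN, FP], [FN, TP]]
    tn, fp, fn, tp = confusion_matrix(y_true, (probs >= threshold)).ravel()
    tpr = tp / (tp + fn)  # true positive rate, a.k.a. recall
    fpr = fp / (fp + tn)  # false positive rate
    # Guard against division by zero when nothing is predicted positive
    precision = tp / (tp + fp) if (tp + fp) > 0 else float('nan')
    return fpr, tpr, precision

# One point per threshold: the default (0.5) and the lower one (0.3)
for t in (0.5, 0.3):
    fpr, tpr, precision = metrics_at(y_val, probabilities_val, t)
    print(f"t={t:.1f}  FPR={fpr:.2f}  TPR (recall)={tpr:.2f}  precision={precision:.2f}")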

Lowering the threshold moves you to the right along both curves.

Let's move to the other side now.

High Threshold

What about 70%? If the predicted probability is greater than or equal to 70%, we classify the data point as positive, and as negative otherwise. That's a very strict threshold, since we require the model to be very confident before considering a data point positive. What can we expect from it? Fewer false positives, more false negatives.

Figure 3.15 - Using a high threshold

You can see in the figure above that raising the threshold (moving it to the right on the probability line) turned two false positives into true negatives (red points close to 0.6), but it also turned one true positive into a false negative (blue point close to 0.6).

Let's double-check it with Scikit-Learn's confusion matrix:

confusion_matrix(y_val, (probabilities_val >= 0.7))

Output

array([[9, 0],
       [2, 9]])

OK, now let's plot the corresponding metrics again:
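Besides plotting these points, we can also see the same trade-off numerically by reusing the metrics_at helper sketched above (again, a hypothetical name) and sweeping it over a range of thresholds, hinting at the curve we will eventually draw:

import numpy as np

# Sweep thresholds from lenient to strict; y_val and probabilities_val
# are assumed from the chapter's validation set
for t in np.linspace(0.1, 0.9, 9):
    fpr, tpr, precision = metrics_at(y_val, probabilities_val, t)
    print(f"t={t:.1f}  FPR={fpr:.2f}  TPR={tpr:.2f}  precision={precision:.2f}")

As the threshold grows stricter, both the false positive rate and the recall shrink: raising the threshold moves you to the left along both curves, the mirror image of what we saw before.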

