
Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub


The value z above is the output of our model. It is a "glorified linear regression!"

And this is a classification problem! How come?! Hold that thought; it will become

clearer in the next section, "Decision Boundary".

But, before going down that road, I would like to use our model (and the

StepByStep class) to make predictions for, say, the first four data points in our

training set:

Making Predictions (Logits)

predictions = sbs.predict(x_train_tensor[:4])

predictions

Output

array([[ 0.20252657],

[ 2.944347 ],

[ 3.6948545 ],

[-1.2356305 ]], dtype=float32)

Clearly, these are not probabilities, right? These are logits, as expected.

We can still get the corresponding probabilities, though.

"How do we go from logits to probabilities," you ask, just to make

sure you got it right.

That’s what the sigmoid function is good for.

Making Predictions (Probabilities)

probabilities = sigmoid(predictions)

probabilities
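The call above assumes a sigmoid function is already in scope. A minimal NumPy sketch of it (the logit values below simply mirror the predictions printed earlier) would look like this:

```python
import numpy as np

def sigmoid(z):
    # Squashes any logit z into (0, 1): sigma(z) = 1 / (1 + exp(-z))
    return 1 / (1 + np.exp(-z))

# The four logits predicted above
logits = np.array([0.20252657, 2.944347, 3.6948545, -1.2356305],
                  dtype=np.float32)

probabilities = sigmoid(logits)
# Every value now lies strictly between 0 and 1, so each can be
# read as the probability of the point belonging to the positive class
```

Notice that a logit of zero maps to a probability of exactly 0.5, positive logits map above 0.5, and negative logits below it.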

234 | Chapter 3: A Simple Classification Problem
