
Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub


Output

[0.49671415] [-0.1382643]

[0.80119529] [0.04511107]
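If the first line above shows the randomly initialized parameters and the second shows their values after one update, the update step can be sketched as below. The learning rate of 0.1 and the gradient values are assumptions made only for this illustration (the gradients are the ones implied by the before/after values shown above):

```python
import numpy as np

# Same random initialization as above (seed 42)
np.random.seed(42)
b = np.random.randn(1)  # array([0.49671415])
w = np.random.randn(1)  # array([-0.1382643])

# Hypothetical values, for illustration only: a learning rate of 0.1
# and the gradients implied by the updated parameters shown above
lr = 0.1
b_grad = -3.0448114
w_grad = -1.8337537

# Step 4 - update parameters by moving against the gradient
b = b - lr * b_grad
w = w - lr * w_grad

print(b, w)  # roughly [0.80119529] [0.04511107]
```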

Step 5 - Rinse and Repeat!

Now we use the updated parameters to go back to Step 1 and restart the process.

Definition of Epoch

An epoch is complete whenever every point in the training set (N) has already been used in all steps: forward pass, computing loss, computing gradients, and updating parameters.

During one epoch, we perform at least one update, but no more than N updates.

The number of updates (N/n) will depend on the type of gradient descent being used:

• For batch (n = N) gradient descent, this is trivial, as it uses all points for computing the loss; one epoch is the same as one update.

• For stochastic (n = 1) gradient descent, one epoch means N updates, since every individual data point is used to perform an update.

• For mini-batch (of size n), one epoch has N/n updates, since a mini-batch of n data points is used to perform an update.
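The three cases above can be summarized in a few lines of code. The function name and the example values of N and n are hypothetical, chosen only to illustrate the arithmetic:

```python
def updates_per_epoch(N, n):
    """Number of parameter updates in one epoch for a training set
    of size N and a (mini-)batch size n."""
    return N // n

N = 80
print(updates_per_epoch(N, N))   # batch gradient descent: 1 update per epoch
print(updates_per_epoch(N, 1))   # stochastic gradient descent: N = 80 updates
print(updates_per_epoch(N, 16))  # mini-batch of size 16: N/n = 5 updates
```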

Repeating this process over and over for many epochs is, in a nutshell, training a model.

Linear Regression in Numpy

It’s time to implement our linear regression model using gradient descent and Numpy only.
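Before diving in, here is a minimal sketch of what such an implementation might look like, assuming batch gradient descent, a learning rate of 0.1, and synthetic data generated as y = 1 + 2x + noise (these choices are assumptions for this sketch, not the book's exact setup, which also splits the data into training and validation sets):

```python
import numpy as np

# Hypothetical synthetic data: y = 1 + 2x + Gaussian noise
np.random.seed(42)
x = np.random.rand(100, 1)
y = 1 + 2 * x + 0.1 * np.random.randn(100, 1)

# Step 0 - random initialization of the parameters
np.random.seed(42)
b = np.random.randn(1)
w = np.random.randn(1)

lr = 0.1  # assumed learning rate
for epoch in range(1000):
    # Step 1 - forward pass: compute predictions
    yhat = b + w * x
    # Step 2 - compute the MSE loss
    error = yhat - y
    loss = (error ** 2).mean()
    # Step 3 - compute gradients of the loss w.r.t. b and w
    b_grad = 2 * error.mean()
    w_grad = 2 * (x * error).mean()
    # Step 4 - update parameters against the gradient
    b = b - lr * b_grad
    w = w - lr * w_grad

print(b, w)  # should end up close to the true values b = 1, w = 2
```

Since n = N here (all 100 points are used for every loss computation), each epoch corresponds to exactly one update, as discussed above.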

