
The Math Behind Deep Learning

The easiest way to think about backpropagation is to propagate the error backward (see Figure 10), using an appropriate optimizer algorithm such as gradient descent to adjust the neural network weights with the goal of reducing the error (again, for the sake of simplicity, only a few error values are represented here):

Figure 10: Backward step in backpropagation
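To make the backward step concrete, here is a minimal NumPy sketch of a single gradient descent update for one sigmoid neuron with a squared-error loss; the input values, weights, and learning rate are illustrative assumptions, not taken from the text:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.2])   # input features (illustrative values)
y = 1.0                     # true label
w = np.array([0.1, 0.4])    # weights to be adjusted
b = 0.0                     # bias
lr = 0.1                    # learning rate

# Forward step: compute the prediction and the squared error.
y_hat = sigmoid(np.dot(w, x) + b)
error = 0.5 * (y_hat - y) ** 2

# Backward step: the chain rule propagates the error back to the weights.
delta = (y_hat - y) * y_hat * (1.0 - y_hat)   # dError/dz at the neuron
grad_w = delta * x                             # dError/dw
grad_b = delta                                 # dError/db

# Gradient descent: move the weights against the gradient to reduce the error.
w -= lr * grad_w
b -= lr * grad_b
```

One forward and backward pass performs a single weight update; the factor `y_hat * (1 - y_hat)` is the derivative of the sigmoid, which is how the output error is scaled as it flows back through the neuron.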

The process of forward propagation from input to output and backward propagation of errors is repeated several times, until the error falls below a predefined threshold. The whole process is represented in Figure 11: a set of features is selected as input to a machine learning model, which produces predictions. The predictions are compared with the true labels, and the resulting loss is minimized by the optimizer, which updates the weights of the model:

Figure 11: Forward propagation and backward propagation
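The loop in Figure 11 can be sketched in a few lines of TensorFlow; the tiny model, the toy data, the SGD learning rate, and the stopping threshold below are illustrative assumptions chosen only to show the cycle of forward pass, loss, backward pass, and weight update:

```python
import tensorflow as tf

# Illustrative model and data (not from the text): a small network
# trained on an XOR-style problem.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="tanh"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)
loss_fn = tf.keras.losses.BinaryCrossentropy()

x = tf.constant([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = tf.constant([[0.], [1.], [1.], [0.]])

threshold = 0.05  # predefined error threshold (assumed value)
for step in range(10_000):
    with tf.GradientTape() as tape:
        y_hat = model(x, training=True)   # forward propagation
        loss = loss_fn(y, y_hat)          # compare predictions with true labels
    # Backward propagation: gradients of the loss w.r.t. the weights.
    grads = tape.gradient(loss, model.trainable_variables)
    # The optimizer updates the weights to minimize the loss.
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    if loss < threshold:                  # stop once the error is low enough
        break
```

Each iteration of the loop is one trip around Figure 11; `tf.GradientTape` records the forward pass so that the backward pass can be computed automatically.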

