Neural Network Foundations with TensorFlow 2.0

For a given input, several types of output can be computed, including a method model.evaluate() used to compute the loss values, a method model.predict_classes() used to compute category outputs, and a method model.predict_proba() used to compute class probabilities.
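As a minimal sketch of these calls on a toy classifier (the data, layer sizes, and class count here are illustrative choices, not from the text; note that in recent tf.keras versions predict_classes() and predict_proba() have been removed from Sequential, so the sketch derives the same outputs from model.predict()):

```python
import numpy as np
import tensorflow as tf

# Hypothetical toy data: 100 samples, 20 features, 3 classes.
x = np.random.rand(100, 20).astype("float32")
y = np.random.randint(0, 3, size=(100,))

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

loss, acc = model.evaluate(x, y, verbose=0)  # loss and metric values
probs = model.predict(x, verbose=0)          # class probabilities per sample
classes = np.argmax(probs, axis=-1)          # category outputs
```

Because the final layer is a softmax, each row of probs sums to one, and taking the argmax recovers the predicted category.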

A practical overview of backpropagation

Multi-layer perceptrons learn from training data through a process called backpropagation. In this section we cover the basics; more details can be found in Chapter 15, The Math behind Deep Learning. The process can be described as a way of progressively correcting mistakes as soon as they are detected. Let's see how this works.

Remember that each neural network layer has an associated set of weights that determine the output values for a given set of inputs. Additionally, remember that a neural network can have multiple hidden layers.

At the beginning, all the weights have some random assignment. Then, the net is activated for each input in the training set: values are propagated forward from the input stage through the hidden stages to the output stage, where a prediction is made. Note that we've kept Figure 38 simple by only representing a few values with green dotted lines, but in reality all the values are propagated forward through the network:

Figure 38: Forward step in backpropagation
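The forward step above can be sketched in a few lines of NumPy (the layer sizes, ReLU activation, and random initialization are illustrative assumptions, not taken from the figure):

```python
import numpy as np

def relu(z):
    # Common hidden-layer activation: max(0, z) elementwise.
    return np.maximum(0.0, z)

rng = np.random.default_rng(0)

# Random initial weights, as described in the text:
# 4 inputs -> 3 hidden units -> 2 outputs.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(3)
W2, b2 = rng.normal(size=(3, 2)), np.zeros(2)

x = rng.normal(size=(1, 4))   # one training example (input stage)
h = relu(x @ W1 + b1)         # hidden stage
y_hat = h @ W2 + b2           # output stage: the prediction
```

Each stage is just a matrix multiply plus bias followed by an activation; every value in x contributes to every hidden unit, which is what the full forward propagation in Figure 38 represents.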

Since we know the true observed value in the training set, it is possible to calculate the error made in the prediction. The key intuition of backpropagation is to propagate the error back (see Figure 39), using an appropriate optimizer algorithm such as gradient descent to adjust the neural network weights with the goal of reducing the error (again, for the sake of simplicity, only a few error values are represented here):
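A single error-correction step can be made concrete with tf.GradientTape (the tiny linear model, input values, and learning rate below are illustrative assumptions chosen to keep the sketch to one weight vector):

```python
import tensorflow as tf

# One manual forward + backward step with gradient descent.
w = tf.Variable([[0.5], [-0.3]])       # randomly assigned weights
x = tf.constant([[1.0, 2.0]])          # one training input
y_true = tf.constant([[1.0]])          # true observed value

with tf.GradientTape() as tape:
    y_pred = tf.matmul(x, w)                            # forward step
    loss = tf.reduce_mean(tf.square(y_true - y_pred))   # prediction error

grads = tape.gradient(loss, [w])       # error propagated back through the net
learning_rate = 0.1
w.assign_sub(learning_rate * grads[0]) # adjust weights to reduce the error

# Recompute the error with the adjusted weights: it has decreased.
new_loss = tf.reduce_mean(tf.square(y_true - tf.matmul(x, w)))
```

Repeating this loop over every example in the training set, for many epochs, is exactly the progressive mistake-correction the text describes; Keras performs these steps for you inside model.fit().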
