
Remember that a neural network can have multiple hidden layers, as well as one

input layer and one output layer.
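This layered structure can be sketched as one weight matrix per pair of consecutive layers. The shapes and activation below are illustrative choices, not taken from the book:

```python
# A minimal sketch (assumed sizes): one input layer, two hidden layers,
# and one output layer, represented as NumPy weight matrices.
import numpy as np

rng = np.random.default_rng(0)

n_input, n_hidden1, n_hidden2, n_output = 4, 8, 8, 3

# One weight matrix per connection between consecutive layers.
W1 = rng.standard_normal((n_input, n_hidden1))    # input    -> hidden 1
W2 = rng.standard_normal((n_hidden1, n_hidden2))  # hidden 1 -> hidden 2
W3 = rng.standard_normal((n_hidden2, n_output))   # hidden 2 -> output

x = rng.standard_normal(n_input)
y = np.tanh(np.tanh(np.tanh(x @ W1) @ W2) @ W3)   # forward pass
print(y.shape)  # (3,)
```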

Chapter 15

In addition to that, recall from Chapter 1, Neural Network Foundations with

TensorFlow 2.0, that backpropagation can be described as a way of progressively

correcting mistakes as soon as they are detected. In order to reduce the errors

made by a neural network, we must train the network. Training requires a dataset
pairing input values with the corresponding true output values, and the goal is
for the network to predict outputs as close as possible to those true values.

The key intuition of the backpropagation algorithm is to update the weights of the

connections based on the measured error at the output neuron(s). In the remainder of

this section, we will explain how to formalize this intuition.
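For a single weight, this intuition amounts to nudging the weight against the gradient of the error. The following toy sketch (my own illustrative names and a linear "neuron", not the book's code) shows the update rule in its simplest form:

```python
# Toy sketch of the update intuition: a single weight w is moved opposite
# the gradient of the squared error E = 0.5 * (y_pred - y_true)**2.
def update_weight(w, x, y_true, lr=0.1):
    y_pred = w * x           # linear "neuron", no activation
    error = y_pred - y_true  # measured error at the output
    grad = error * x         # dE/dw
    return w - lr * grad     # move against the gradient

w = 0.0
for _ in range(100):
    w = update_weight(w, x=2.0, y_true=6.0)  # true relationship: y = 3x
print(round(w, 3))  # converges toward 3.0
```

After enough updates the weight settles at the value that makes the prediction match the true output, which is exactly the behavior backpropagation generalizes to every weight in the network.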

When backpropagation starts, all the weights have some random assignment.

Then the net is activated for each input in the training set: values are propagated

forward from the input stage through the hidden stages to the output stage where

a prediction is made (note that we keep the following figure simple by only

representing a few values with green dotted lines, but in reality all the values are

propagated forward through the network):

Figure 9: Forward step in backpropagation

Since we know the true observed value in the training set, it is possible to calculate

the error made in prediction.
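The forward step and the resulting prediction error can be sketched as follows. The 2-3-1 shape, sigmoid activation, and squared-error loss are my assumptions for illustration; the figure does not specify them:

```python
# Minimal sketch of the forward step in backpropagation, assuming a 2-3-1
# network with sigmoid activations and squared-error loss.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(42)
W_hidden = rng.standard_normal((2, 3))  # random initial weights
W_output = rng.standard_normal((3, 1))

x = np.array([0.5, -0.2])               # one training input
y_true = np.array([1.0])                # its true observed output

# Forward propagation: input stage -> hidden stage -> output stage.
hidden = sigmoid(x @ W_hidden)
y_pred = sigmoid(hidden @ W_output)

# Since the true value is known, the prediction error follows directly.
error = 0.5 * np.sum((y_true - y_pred) ** 2)
```

This error is the quantity that the backward step then propagates through the network to correct the weights.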

