
Neural Network Foundations with TensorFlow 2.0

Note that it has been frequently observed that networks with random dropout in their internal hidden layers "generalize" better on the unseen examples contained in test sets. Intuitively, we can think of this phenomenon as each neuron becoming more capable because it knows it cannot depend on its neighbors, and as dropout forcing information to be stored in a redundant way. During testing there is no dropout, so we are now using all of our highly tuned neurons. In short, it is generally a good approach to test how a net performs when dropout is adopted.
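As a concrete illustration, here is a minimal sketch of such a network in Keras, with a dropout layer after each hidden layer. The layer sizes, the 0.3 dropout rate, and the 784-feature (MNIST-style) input are illustrative assumptions rather than values taken from the text:

```python
import tensorflow as tf

DROPOUT_RATE = 0.3  # assumed rate: fraction of neurons randomly dropped

# A dense network with dropout between the hidden layers.
model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(128, activation='relu', input_shape=(784,)),
    tf.keras.layers.Dropout(DROPOUT_RATE),  # active only during training
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dropout(DROPOUT_RATE),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='sgd',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```

Keras disables dropout automatically during evaluation and prediction, which matches the "no dropout during testing" behavior described above.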

Besides that, note that training accuracy should still be above test accuracy; otherwise, we might not be training for long enough. This is the case in our example, and therefore we should increase the number of epochs, as in the quick check sketched below.
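One simple way to perform this check is to evaluate the trained model on both splits. In the following sketch, X_train, Y_train, X_test, and Y_test are assumed to be the usual training and test arrays from the earlier example, and the epoch count is an illustrative value:

```python
# Train for more epochs, then compare accuracy on the two splits.
# X_train, Y_train, X_test, Y_test are assumed to come from the
# earlier example; 50 epochs is an illustrative choice.
history = model.fit(X_train, Y_train,
                    epochs=50, batch_size=128,
                    validation_split=0.2, verbose=0)

_, train_acc = model.evaluate(X_train, Y_train, verbose=0)
_, test_acc = model.evaluate(X_test, Y_test, verbose=0)
print(f"train accuracy: {train_acc:.4f}, test accuracy: {test_acc:.4f}")
```

If the training accuracy does not climb above the test accuracy even with more epochs, the model is likely still underfitting.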

However, before making this attempt we need to introduce a few other concepts that allow training to converge faster. Let's talk about optimizers.

Testing different optimizers in TensorFlow 2.0

Now that we have defined and used a network, it is useful to start developing some intuition about how networks are trained, using an analogy. Let us focus on one popular training technique known as Gradient Descent (GD). Imagine a generic cost function C(w) in one single variable w, as shown in Figure 18:

Figure 18: An example of gradient descent optimization

Gradient descent can be seen as a hiker who needs to navigate down a steep slope and aims to enter a ditch. The slope represents the function C, while the ditch represents the minimum C_min. The hiker has a starting point w_0. The hiker moves little by little; imagine that there is almost zero visibility, so the hiker cannot see where to go and proceeds in a zigzag. At each step r, the gradient is the direction of maximum increase, so the hiker moves in the opposite direction in order to descend.
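To make the analogy concrete, here is a minimal sketch of gradient descent on a one-variable cost function. The quadratic C(w) = (w - 3)^2, the learning rate eta, and the starting point w_0 are all illustrative assumptions:

```python
# Gradient descent in one variable.
# C(w) = (w - 3)**2 is an assumed cost whose minimum C_min sits at w = 3;
# the learning rate eta and the starting point w0 are also illustrative.

def C(w):
    return (w - 3.0) ** 2

def dC_dw(w):
    return 2.0 * (w - 3.0)  # the gradient: direction of maximum increase

eta = 0.1  # learning rate: how far the hiker moves at each step
w = 0.0    # starting point w_0
for r in range(50):         # each iteration is one step of the hiker
    w = w - eta * dC_dw(w)  # step against the gradient, i.e. downhill

print(f"w = {w:.4f}, C(w) = {C(w):.6f}")  # w approaches 3, C approaches 0
```

Each update subtracts eta times the gradient from w, moving a small amount in the direction that decreases C: exactly the hiker's downhill zigzag.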
