Advanced Deep Learning with Keras

Introducing Advanced Deep Learning with Keras

The tanh function maps its input into the range [-1.0, 1.0]. This is important if the output can swing between positive and negative values. The tanh function is more commonly used in the internal layers of recurrent neural networks, but it has also been used as an output layer activation. If tanh is used to replace sigmoid in the

output activation, the data used must be scaled appropriately. For example, instead of scaling each grayscale pixel into the range [0.0, 1.0] using x = x / 255, it is assigned into the range [-1.0, 1.0] by x = (x - 127.5) / 127.5.
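As a quick sketch of the two scalings described above (an illustrative example, not code from the book; the array of pixel values is hypothetical):

```python
import numpy as np

# Hypothetical grayscale pixel values in [0, 255]
pixels = np.array([0.0, 63.75, 127.5, 255.0], dtype=np.float32)

# Scaling for a sigmoid output: x / 255 maps pixels into [0.0, 1.0]
sigmoid_scaled = pixels / 255.0

# Scaling for a tanh output: (x - 127.5) / 127.5 maps pixels into [-1.0, 1.0]
tanh_scaled = (pixels - 127.5) / 127.5
```

With this scaling, a mid-gray pixel (127.5) lands at 0.5 for sigmoid and at 0.0 for tanh, matching the center of each activation's output range.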

The following graph shows the sigmoid and tanh functions. Mathematically,

sigmoid can be expressed as follows:

sigmoid(x) = σ(x) = 1 / (1 + e^(-x))    (Equation 1.3.6)
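A minimal numerical check of Equation 1.3.6 and the bounded ranges of both activations (a sketch for illustration, not code from the book):

```python
import math

def sigmoid(x):
    # Equation 1.3.6: sigmoid(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

# sigmoid is bounded in (0, 1) and centered at 0.5;
# tanh is bounded in (-1, 1) and centered at 0.
print(sigmoid(0.0))    # 0.5
print(math.tanh(0.0))  # 0.0

# The two functions are related by tanh(x) = 2*sigmoid(2x) - 1,
# which is why replacing one with the other requires rescaling the data.
```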

Figure 1.3.6: Plots of sigmoid and tanh
