
Neural Network Foundations with TensorFlow 2.0

Two additional activation functions – ELU and LeakyReLU

Sigmoid and ReLU are not the only activation functions used for learning.

ELU is defined as

    f(α, x) = { α(e^x − 1)   if x ≤ 0
              { x            if x > 0

for α > 0, and its plot is represented in Figure 9:

Figure 9: An ELU function
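The piecewise definition above can be sketched directly in NumPy; this is a minimal illustration of the formula, not the library implementation (in practice `tf.keras.activations.elu` provides the same function):

```python
import numpy as np

def elu(x, alpha=1.0):
    """ELU: alpha * (e^x - 1) when x <= 0, identity when x > 0."""
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

# Negative inputs saturate smoothly toward -alpha; positive inputs pass through.
print(elu(np.array([-2.0, 0.0, 3.0])))
```

Unlike ReLU, ELU is smooth at zero and produces negative outputs for negative inputs, which pushes mean activations closer to zero.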

LeakyReLU is defined as

    f(α, x) = { αx   if x ≤ 0
              { x    if x > 0

for α > 0, and its plot is represented in Figure 10:

Figure 10: A LeakyReLU function
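LeakyReLU is even simpler to express; the sketch below uses an illustrative slope of α = 0.05 (the value is an assumption, not one fixed by the definition), mirroring what `tf.keras.layers.LeakyReLU` computes:

```python
import numpy as np

def leaky_relu(x, alpha=0.05):
    """LeakyReLU: alpha * x when x <= 0, identity when x > 0."""
    return np.where(x > 0, x, alpha * x)

# Negative inputs are scaled by alpha instead of being zeroed out.
print(leaky_relu(np.array([-2.0, 0.0, 3.0])))
```

The small negative slope keeps a nonzero gradient for negative inputs, avoiding the "dying ReLU" problem where units stop updating entirely.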
