
Advanced Deep Learning with Keras


Generative Adversarial Networks (GANs)

• Use of Leaky ReLU in all layers of the discriminator. Unlike ReLU, which zeroes out all outputs when the input is less than zero, Leaky ReLU outputs a small value equal to alpha × input, so the gradient for negative inputs is alpha rather than zero. In the following example, alpha = 0.2.
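A minimal sketch of one such discriminator block (the filter count and input shape are illustrative assumptions, not the book's exact settings):

from tensorflow.keras.layers import Input, Conv2D, LeakyReLU

# One discriminator block: a strided convolution followed by Leaky ReLU.
# With alpha = 0.2, a negative pre-activation z outputs 0.2 * z instead
# of 0, so its gradient is 0.2 rather than zero.
inputs = Input(shape=(28, 28, 1))  # MNIST-sized images (assumption)
x = Conv2D(filters=32, kernel_size=5, strides=2, padding='same')(inputs)
x = LeakyReLU(alpha=0.2)(x)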

Figure 4.2.1: A DCGAN model

The generator learns to generate fake images from 100-dim input vectors (100-dim random noise drawn from a uniform distribution over the range [-1.0, 1.0]). The discriminator classifies real images from fake ones, but in doing so it inadvertently coaches the generator on how to generate realistic images when the adversarial network is trained. The kernel size used in our DCGAN implementation is 5; this increases the coverage and expressive power of each convolution.
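As a minimal sketch of how such z-vectors can be sampled (using NumPy; the batch size is an illustrative assumption):

import numpy as np

# Sample a batch of 100-dim latent vectors from a uniform
# distribution over [-1.0, 1.0).
batch_size = 64
latent_size = 100
z = np.random.uniform(-1.0, 1.0, size=(batch_size, latent_size))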

The generator accepts the 100-dim z-vector sampled from a uniform distribution with a range of -1.0 to 1.0. The first layer of the generator is a 7 × 7 × 128 = 6,272-unit Dense layer. The number of units is computed from the intended dimensions of the output image (28 × 28 × 1; 28 is a multiple of 7, so the 7 × 7 feature maps can be upsampled to 28 × 28 by strided transposed convolutions) and the number of filters of the first Conv2DTranspose, which is equal to 128. We can imagine transposed CNNs (Conv2DTranspose) as the reverse of the CNN process: where a CNN converts an image to feature maps, a transposed CNN produces an image given feature maps. Hence, transposed CNNs were used in the decoder in the previous chapter and are used here in the generator.
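Putting this paragraph together, a minimal sketch of such a generator in Keras follows (the intermediate filter counts and the use of batch normalization are assumptions for illustration; the 6,272-unit Dense layer, the kernel size of 5, and the 28 × 28 × 1 output follow the text):

from tensorflow.keras.layers import (Input, Dense, Reshape, Activation,
                                     BatchNormalization, Conv2DTranspose)
from tensorflow.keras.models import Model

latent_size = 100
inputs = Input(shape=(latent_size,))
# 7 * 7 * 128 = 6,272 units, reshaped into 7 x 7 feature maps
# with 128 channels for the first Conv2DTranspose.
x = Dense(7 * 7 * 128)(inputs)
x = Reshape((7, 7, 128))(x)
# Two stride-2 transposed convolutions upsample 7 -> 14 -> 28.
for filters in (128, 64):
    x = Conv2DTranspose(filters=filters, kernel_size=5, strides=2,
                        padding='same')(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
# A final transposed convolution produces one channel; sigmoid maps
# pixel values into [0, 1].
x = Conv2DTranspose(filters=1, kernel_size=5, strides=1,
                    padding='same')(x)
outputs = Activation('sigmoid')(x)
generator = Model(inputs, outputs, name='generator')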

