
Disentangled Representation GANs

The last loss function is similar to the usual GAN loss. It is made of a discriminator loss L_i^(D) and a generator (adversarial) loss L_i^(G)adv. The following figure shows us the elements involved in the GAN loss:

Figure 6.2.6: A simpler version of Figure 6.2.3 showing only the network elements involved in the computation of L_i^(D) and L_i^(G)adv
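Since the text describes this as the usual GAN loss applied per level, the standard form is sketched below for reference, with f_i the real feature produced by Encoder_i and G_i(f_{i+1}, z_i) the corresponding fake feature. This is the generic GAN formulation written out for level i, not a quotation of the book's equations:

L_i^{(D)} = -\mathbb{E}_{f_i}\left[\log D_i(f_i)\right] - \mathbb{E}_{f_{i+1}, z_i}\left[\log\left(1 - D_i(G_i(f_{i+1}, z_i))\right)\right]

L_i^{(G)adv} = -\mathbb{E}_{f_{i+1}, z_i}\left[\log D_i(G_i(f_{i+1}, z_i))\right]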

In Equation 6.2.5, the weighted sum of the three generator loss functions is the final generator loss function. In the Keras code that we will present, all the weights are set to 1.0, except for the entropy loss, which is set to 10.0. In Equations 6.2.1 to 6.2.5, i refers to the encoder and GAN group id or level. In the original paper, the network is first trained independently and then jointly. During independent training, the encoder is trained first. During joint training, both real and fake data are used.
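As a rough illustration of how such loss weights can be wired up, the sketch below compiles an adversarial model with three outputs, one per generator loss term, using the tf.keras API. The model name adv1, the choice of per-output losses, the output ordering, and the optimizer settings are assumptions for illustration, not the book's exact listing; only the 1.0/10.0 weighting comes from the text above.

from tensorflow.keras.optimizers import RMSprop

# Hypothetical adversarial model with three outputs, one per generator
# loss term: adversarial, entropy (noise-code recovery), and conditional.
# adv1 is assumed to be a previously built Keras Model with those outputs.
loss = ['binary_crossentropy',  # adversarial loss
        'mse',                  # entropy loss (recover the noise code)
        'mse']                  # conditional loss (recover the conditioning input)
# All weights are 1.0 except the entropy loss, which is weighted 10.0,
# matching the weighting described in the text.
loss_weights = [1.0, 10.0, 1.0]
adv1.compile(loss=loss,
             loss_weights=loss_weights,
             optimizer=RMSprop(learning_rate=2e-4),
             metrics=['accuracy'])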

The implementation of the StackedGAN generator and discriminator in Keras requires a few changes to provide auxiliary points to access the intermediate features. Figure 6.2.7 shows the generator Keras model. Listing 6.2.2 illustrates the function that builds the two generators (gen0 and gen1) corresponding to Generator 0 and Generator 1. The gen1 generator is made of three Dense layers with the label and the noise code z1f as inputs. The third layer generates the fake f1f feature. The gen0 generator is similar to the other GAN generators that we've presented and can be instantiated using the generator builder in gan.py:

# gen0: feature1 + z0 to feature0 (image)

gen0 = gan.generator(feature1, image_size, codes=z0)

The gen0 inputs are the f1 features and the noise code z0. The output is the generated fake image, xf.
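A minimal sketch of the two-generator builder described above follows, reconstructing the shape of Listing 6.2.2 from this description. The hidden-layer width of 512, the BatchNormalization layers, the feature1_dim=256 default, and the packing of latent_codes are assumptions; the gen0 line reuses the gan.generator call quoted above, with gan.py being the book's helper module.

from tensorflow.keras.layers import Dense, BatchNormalization, concatenate
from tensorflow.keras.models import Model

import gan  # the book's helper module (gan.py)

def build_generator(latent_codes, image_size, feature1_dim=256):
    """Build gen0 and gen1 (a sketch; layer sizes are assumptions)."""
    # latent_codes is assumed to hold the Input tensors built elsewhere:
    # the one-hot labels, the noise codes z0 and z1, and the feature1 input
    labels, z0, z1, feature1 = latent_codes

    # gen1: three Dense layers mapping (label, z1f) to the fake f1f feature
    inputs = [labels, z1]
    x = concatenate(inputs, axis=1)
    x = Dense(512, activation='relu')(x)
    x = BatchNormalization()(x)
    x = Dense(512, activation='relu')(x)
    x = BatchNormalization()(x)
    fake_feature1 = Dense(feature1_dim, activation='relu')(x)
    gen1 = Model(inputs, fake_feature1, name='gen1')

    # gen0: feature1 + z0 to feature0 (image), via the builder in gan.py
    gen0 = gan.generator(feature1, image_size, codes=z0)
    return gen0, gen1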
