Improved GANs

Figure 5.1.3: Top: Training the WGAN discriminator requires fake data from the generator and real data from the true distribution. Bottom: Training the WGAN generator requires fake data from the generator pretending to be real.

Similar to GANs, WGAN alternately trains the discriminator and the generator through adversarial training. However, in WGAN the discriminator (also called the critic) trains for n_critic iterations (Lines 2 to 8) before the generator is trained for one iteration (Lines 9 to 11). This is in contrast to GANs, which use an equal number of training iterations for the discriminator and the generator. Training the discriminator means learning the parameters (weights and biases) of the discriminator. This requires sampling a batch from the real data (Line 3) and a batch from the fake data (Line 4), then computing the gradient of the discriminator parameters (Line 5) after feeding the sampled data to the discriminator network. The discriminator parameters are optimized using RMSprop (Line 6). Both Lines 5 and 6 perform the optimization of Equation 5.1.21. Adam was found to be unstable in WGAN.
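The alternating loop described above can be sketched in tf.keras. This is a minimal illustration, not the book's listing: the critic and generator builders, the batch size, the 5e-5 learning rate, and the clip value are assumptions, the ±1 labels are one common way to implement the Wasserstein critic loss via train_on_batch, and the weight-clipping step comes from the original WGAN algorithm rather than from this excerpt.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers
from tensorflow.keras import backend as K

def wasserstein_loss(y_true, y_pred):
    # With labels +1 for real and -1 for fake, minimizing this loss
    # raises the critic score on real data and lowers it on fake data.
    return -K.mean(y_true * y_pred)

latent_dim = 100

def build_critic():
    # Hypothetical critic: linear output, since the critic is not a classifier.
    return models.Sequential([
        layers.Dense(128, activation="relu", input_shape=(28 * 28,)),
        layers.Dense(1)
    ])

def build_generator():
    # Hypothetical generator mapping latent vectors to flattened images.
    return models.Sequential([
        layers.Dense(128, activation="relu", input_shape=(latent_dim,)),
        layers.Dense(28 * 28, activation="tanh")
    ])

critic = build_critic()
generator = build_generator()

# RMSprop, since Adam was found to be unstable in WGAN.
critic.compile(loss=wasserstein_loss,
               optimizer=optimizers.RMSprop(learning_rate=5e-5))

# Adversarial model: generator followed by a frozen critic.
critic.trainable = False
z = layers.Input(shape=(latent_dim,))
adversarial = models.Model(z, critic(generator(z)))
adversarial.compile(loss=wasserstein_loss,
                    optimizer=optimizers.RMSprop(learning_rate=5e-5))

def train(x_train, steps=10000, batch_size=64, n_critic=5, clip_value=0.01):
    real_labels = np.ones((batch_size, 1))
    for step in range(steps):
        # Lines 2 to 8: train the critic for n_critic iterations.
        for _ in range(n_critic):
            # Line 3: sample a batch from the real data.
            real = x_train[np.random.randint(0, x_train.shape[0], batch_size)]
            # Line 4: sample a batch of fake data from the generator.
            noise = np.random.uniform(-1.0, 1.0, size=(batch_size, latent_dim))
            fake = generator.predict(noise, verbose=0)
            # Lines 5 and 6: gradient step on the critic with RMSprop.
            critic.train_on_batch(real, real_labels)
            critic.train_on_batch(fake, -real_labels)
            # Weight clipping from the original WGAN algorithm
            # (assumed here; not described in this excerpt).
            for layer in critic.layers:
                layer.set_weights([np.clip(w, -clip_value, clip_value)
                                   for w in layer.get_weights()])
        # Lines 9 to 11: train the generator for one iteration
        # through the adversarial model (critic weights are frozen there).
        noise = np.random.uniform(-1.0, 1.0, size=(batch_size, latent_dim))
        adversarial.train_on_batch(noise, real_labels)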
