Improved GANs

The following figure shows the outputs of both DCGAN and WGAN with batch normalization on the discriminator network:

Figure 5.1.5: A comparison of the output of DCGAN (left) and WGAN (right) when batch normalization is inserted before the ReLU activation in the discriminator network
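In code, a discriminator layer block with batch normalization placed before the activation (the configuration compared in the figure) might look roughly like the following sketch; the filter count, kernel size, and use of tf.keras layers here are illustrative assumptions:

import tensorflow as tf
from tensorflow.keras.layers import Conv2D, BatchNormalization, Activation

def conv_block(x, filters=64, strides=2):
    # Conv2D -> BatchNormalization -> ReLU, i.e. batch norm inserted
    # before the activation, as described for the discriminator above.
    x = Conv2D(filters, kernel_size=5, strides=strides, padding='same')(x)
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    return x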

Similar to the GAN training in the previous chapter, the trained model is saved to a file after 40,000 training steps. I encourage you to run the trained generator model to see newly synthesized MNIST digit images:

python3 wgan-mnist-5.1.2.py --generator=wgan_mnist.h5
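If you would rather load the saved generator in your own script, a minimal sketch is shown below. It assumes the wgan_mnist.h5 file above, a 100-dimensional latent vector as used in this chapter's examples, and the tf.keras API:

import numpy as np
import matplotlib.pyplot as plt
from tensorflow.keras.models import load_model

# Assumed: generator saved as wgan_mnist.h5 with a 100-dim latent input.
generator = load_model("wgan_mnist.h5")
latent_size = 100

# Sample 16 random latent vectors and synthesize fake MNIST digits.
noise = np.random.uniform(-1.0, 1.0, size=(16, latent_size))
images = generator.predict(noise)

# Plot the generated digits in a 4x4 grid.
for i in range(16):
    plt.subplot(4, 4, i + 1)
    plt.imshow(images[i].reshape(28, 28), cmap='gray')
    plt.axis('off')
plt.show()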

Least-squares GAN (LSGAN)

As discussed in the previous section, the original GAN is difficult to train. The problem arises when the GAN optimizes its loss function; it is actually optimizing the Jensen-Shannon divergence, D_JS. It is difficult to optimize D_JS when there is little to no overlap between the two distribution functions.

WGAN proposed to address the problem by using the EMD or Wasserstein-1 loss function, which is smooth and differentiable even when there is little or no overlap between the two distributions. However, WGAN is not concerned with the quality of the generated images. Apart from stability issues, there is still room for improvement in the perceptual quality of the images generated by the original GAN. LSGAN theorizes that these twin problems can be solved simultaneously.
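Before turning to LSGAN, it may help to recall how the Wasserstein loss typically appears in Keras code. The sketch below assumes the labeling convention used in this chapter's WGAN example, where real samples are labeled 1.0 and fake samples -1.0:

from tensorflow.keras import backend as K

def wasserstein_loss(y_label, y_pred):
    # EMD/Wasserstein-1 loss: with real samples labeled 1.0 and fake
    # samples labeled -1.0, minimizing this pushes the critic's average
    # scores for real and fake data apart.
    return -K.mean(y_label * y_pred)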

LSGAN proposes the least squares loss. Figure 5.2.1 demonstrates why the use of a sigmoid cross-entropy loss in the GAN results in poor quality of generated data. Ideally, the fake samples' distribution should be as close as possible to the true samples' distribution. However, for GANs, once the fake samples are already on the correct side of the decision boundary, the gradients vanish.
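A small numerical sketch of this vanishing-gradient argument is given below, using an illustrative discriminator output value. For a fake sample already classified as real, the sigmoid cross-entropy gradient with respect to the logit collapses toward zero, while the least squares gradient on a linear output stays proportional to the distance from the target:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator's objective: have fake samples labeled as real (target = 1).
# Consider a fake sample whose logit is already far on the "real" side.
logit = 6.0  # illustrative value, well past the decision boundary

# Sigmoid cross-entropy loss -log(sigmoid(logit)); its gradient w.r.t.
# the logit is sigmoid(logit) - 1, which saturates toward zero.
bce_grad = sigmoid(logit) - 1.0

# Least squares loss 0.5 * (logit - 1)^2 on a linear output; its gradient
# is (logit - 1), which stays proportional to the distance from the target.
mse_grad = logit - 1.0

print("cross-entropy gradient: %.4f" % bce_grad)   # about -0.0025
print("least-squares gradient: %.4f" % mse_grad)   # 5.0000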
