
Improved GANs

The functions in the gan module include:

• generator(): A generator model builder
• discriminator(): A discriminator model builder
• train(): A DCGAN trainer
• plot_images(): A generic plotter of generator outputs
• test_generator(): A generic generator test utility

As shown in Listing 5.1.1, we can build a discriminator by simply calling:

discriminator = gan.discriminator(inputs, activation='linear')

WGAN uses a linear output activation because its discriminator (the critic) predicts an unbounded score rather than a probability. For the generator, we execute:

generator = gan.generator(inputs, image_size)

The overall network model in Keras is similar to the one seen in Figure 4.2.1 for DCGAN.
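
To make the wiring concrete, here is a minimal sketch of how the two builders might be composed into the critic and adversarial models. It assumes the gan module's builders behave as described above and uses wasserstein_loss() as named in Listing 5.1.1 (sketched after the listing below); the learning rate of 5e-5 follows the original WGAN paper and is an assumption here, not necessarily the book's exact value:

import gan  # the book's module of model builders (assumed importable)
from keras.layers import Input
from keras.models import Model
from keras.optimizers import RMSprop

latent_size = 100
image_size = 28

# critic: an image goes in, an unbounded linear score comes out
inputs = Input(shape=(image_size, image_size, 1), name='discriminator_input')
discriminator = gan.discriminator(inputs, activation='linear')
discriminator.compile(loss=wasserstein_loss, optimizer=RMSprop(lr=5e-5))

# generator: a latent vector goes in, a synthetic image comes out
z_inputs = Input(shape=(latent_size,), name='z_input')
generator = gan.generator(z_inputs, image_size)

# adversarial model: the critic is frozen while the generator trains
discriminator.trainable = False
adversarial = Model(z_inputs, discriminator(generator(z_inputs)), name='wgan')
adversarial.compile(loss=wasserstein_loss, optimizer=RMSprop(lr=5e-5))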

Listing 5.1.1 highlights the use of the RMSprop optimizer and the Wasserstein loss function. The hyper-parameters in Algorithm 5.1.1 are used during training. Listing 5.1.2 is the training function, which closely follows the algorithm. However, there is a minor tweak in the training of the discriminator: instead of training the weights on a single combined batch of real and fake data, we'll train on one batch of real data first and then on a batch of fake data. This tweak prevents the gradient from vanishing due to the opposite signs of the real and fake data labels and the small magnitude of the weights caused by clipping.
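
As a hedged illustration of this tweak, one critic update inside the training loop might look like the following sketch. It assumes the generator, discriminator, and x_train from the listings in this section; the batch_size, n_critic, and clip_value names and values are placeholders in the spirit of Algorithm 5.1.1, not the book's exact code:

import numpy as np

batch_size = 64
latent_size = 100
n_critic = 5        # critic updates per generator update
clip_value = 0.01   # weight-clipping bound

# labels for the Wasserstein loss: +1 for real, -1 for fake
real_labels = np.ones((batch_size, 1))

for _ in range(n_critic):
    # train on one batch of real images first
    idx = np.random.randint(0, x_train.shape[0], size=batch_size)
    loss_real = discriminator.train_on_batch(x_train[idx], real_labels)

    # then train on one batch of fake images
    noise = np.random.uniform(-1.0, 1.0, size=(batch_size, latent_size))
    fake_images = generator.predict(noise)
    loss_fake = discriminator.train_on_batch(fake_images, -real_labels)

    # clip every critic weight to [-clip_value, clip_value]
    # to keep the critic approximately Lipschitz
    for layer in discriminator.layers:
        weights = [np.clip(w, -clip_value, clip_value)
                   for w in layer.get_weights()]
        layer.set_weights(weights)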

The complete code is available on GitHub:

https://github.com/PacktPublishing/Advanced-Deep-Learning-with-Keras

Figure 5.1.4 shows the evolution of the WGAN outputs on the MNIST dataset.

Listing 5.1.1, wgan-mnist-5.1.2.py. The WGAN model instantiation and training. Both the discriminator and generator use the Wasserstein 1 loss, wasserstein_loss():

import numpy as np
from keras.datasets import mnist

def build_and_train_models():
    # load MNIST dataset
    (x_train, _), (_, _) = mnist.load_data()

    # reshape data for CNN as (28, 28, 1) and normalize
    image_size = x_train.shape[1]
    x_train = np.reshape(x_train, [-1, image_size, image_size, 1])
    x_train = x_train.astype('float32') / 255

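The wasserstein_loss() function named in the caption above can be sketched as follows; this is a minimal version assuming the Keras backend API, with y_true taking the value +1 for real samples and -1 for fake samples:

from keras import backend as K

def wasserstein_loss(y_true, y_pred):
    # the negative mean of label * score; minimizing it pushes real
    # scores up and fake scores down, approximating the negative
    # Wasserstein-1 distance between the two score distributions
    return -K.mean(y_true * y_pred)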
