Generative Adversarial Networks (GANs)

Listing 4.3.3 highlights the changes made in the train() function to accommodate
the conditioning one-hot vector for the discriminator and the generator. The CGAN
discriminator is first trained with one batch of real and fake data conditioned
on their respective one-hot labels. Then, the generator parameters are updated
by training the adversarial network on one-hot label conditioned fake data
pretending to be real. As in DCGAN, the discriminator weights are frozen
during adversarial training (see the sketch below).
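The freezing itself happens when the adversarial model is composed, before it is
compiled. The following is a minimal sketch of that composition, assuming the
Keras functional API and pre-built generator and discriminator models that each
take a one-hot label as a second input; the input names and optimizer settings
here are illustrative and are not taken from the listing:

from tensorflow.keras.layers import Input
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import RMSprop

# latent vector and one-hot label inputs (sizes as in the listing)
noise = Input(shape=(latent_size,), name='z_input')
label = Input(shape=(num_labels,), name='label')
# freeze the discriminator so only the generator is updated
# when the adversarial network is trained
discriminator.trainable = False
# adversarial = generator + frozen discriminator, both conditioned
# on the same one-hot label
adversarial = Model([noise, label],
                    discriminator([generator([noise, label]), label]),
                    name='adversarial')
adversarial.compile(loss='binary_crossentropy',
                    optimizer=RMSprop(learning_rate=2e-4),
                    metrics=['accuracy'])

Because the discriminator is compiled separately before being frozen, it keeps
learning in its own training step while staying fixed inside adversarial.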

Listing 4.3.3, cgan-mnist-4.3.1.py, shows the CGAN train() function. The
changes made relative to DCGAN are highlighted:

import numpy as np

def train(models, data, params):
    """Train the Discriminator and Adversarial Networks

    Alternately train Discriminator and Adversarial networks by batch.
    Discriminator is trained first with properly labelled real and
    fake images.
    Adversarial is trained next with fake images pretending to be real.
    Discriminator inputs are conditioned by train labels for real
    images, and random labels for fake images.
    Adversarial inputs are conditioned by random labels.
    Generate sample images per save_interval.

    # Arguments
        models (list): Generator, Discriminator, Adversarial models
        data (list): x_train, y_train data
        params (list): Network parameters
    """
    # the GAN models
    generator, discriminator, adversarial = models
    # images and labels
    x_train, y_train = data
    # network parameters
    batch_size, latent_size, train_steps, num_labels, model_name = params
    # the generator image is saved every 500 steps
    save_interval = 500
    # noise vector to see how the generator output evolves during training
    noise_input = np.random.uniform(-1.0, 1.0, size=[16, latent_size])
    # one-hot label the noise will be conditioned to
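The listing breaks off at a page boundary here. To keep the walkthrough
self-contained, below is a minimal sketch of how the training loop can
continue, following the variable names above. It assumes generator,
discriminator, and adversarial are compiled Keras models taking
[data, one-hot label] input pairs; it illustrates the procedure described in
the text and is not the book's verbatim code:

    # --- sketch: illustrative continuation, not verbatim ---
    # one-hot labels the 16 sample noise vectors are conditioned on
    noise_label = np.eye(num_labels)[np.arange(0, 16) % num_labels]
    for i in range(train_steps):
        # train the discriminator for one batch:
        # pick random real images with their one-hot train labels
        rand_indexes = np.random.randint(0, x_train.shape[0],
                                         size=batch_size)
        real_images = x_train[rand_indexes]
        real_labels = y_train[rand_indexes]
        # generate fake images conditioned on random one-hot labels
        noise = np.random.uniform(-1.0, 1.0,
                                  size=[batch_size, latent_size])
        fake_labels = np.eye(num_labels)[np.random.choice(num_labels,
                                                          batch_size)]
        fake_images = generator.predict([noise, fake_labels])
        # real + fake images form one batch; label real 1.0, fake 0.0
        x = np.concatenate((real_images, fake_images))
        labels = np.concatenate((real_labels, fake_labels))
        y = np.ones([2 * batch_size, 1])
        y[batch_size:, :] = 0.0
        loss, acc = discriminator.train_on_batch([x, labels], y)
        # train the adversarial network for one batch:
        # fake data conditioned on random labels, all marked as real;
        # the discriminator is frozen, so only the generator learns
        noise = np.random.uniform(-1.0, 1.0,
                                  size=[batch_size, latent_size])
        fake_labels = np.eye(num_labels)[np.random.choice(num_labels,
                                                          batch_size)]
        y = np.ones([batch_size, 1])
        loss, acc = adversarial.train_on_batch([noise, fake_labels], y)
        # (periodic sample-image saving every save_interval steps is
        # omitted from this sketch)

Note how the one-hot label rides along in every train_on_batch call; that
extra conditioning input is the essential difference from the DCGAN
training loop.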
