Advanced Deep Learning with Keras
Chapter 7

                   preal_target,
                   reco_source,
                   reco_target,
                   iden_source,
                   iden_target]
    else:
        loss = ['mse', 'mse', 'mae', 'mae']
        loss_weights = [1., 1., 10., 10.]
        inputs = [source_input, target_input]
        outputs = [preal_source,
                   preal_target,
                   reco_source,
                   reco_target]

    # build adversarial model
    adv = Model(inputs, outputs, name='adversarial')
    optimizer = RMSprop(lr=lr*0.5, decay=decay*0.5)
    adv.compile(loss=loss,
                loss_weights=loss_weights,
                optimizer=optimizer,
                metrics=['accuracy'])
    print('---- ADVERSARIAL NETWORK ----')
    adv.summary()

    return g_source, g_target, d_source, d_target, adv

We follow the training procedure in Algorithm 7.1.1 from the previous section. The following listing shows the CycleGAN training. The minor difference between this training and that of the vanilla GAN is that there are two discriminators to be optimized. However, there is only one adversarial model to optimize. Every 2,000 steps, the generators save the predicted source and target images. We'll use a batch size of 32. We also tried a batch size of one, but the output quality is almost the same while training takes longer (43 ms/image for a batch size of one vs. 3.6 ms/image for a batch size of 32 on an NVIDIA GTX 1060).
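Before the full listing, the alternating updates just described can be sketched as a single training step. This is a minimal sketch, not the book's code: the function name `train_step`, the `patch_shape` argument, and the exact label targets are assumptions; only the order of updates (target discriminator, source discriminator, then the one adversarial model covering both cycles) follows the text:

```python
import numpy as np

def train_step(models, real_source, real_target, patch_shape):
    """One CycleGAN step (hypothetical interface): discriminators
    first, then the single adversarial model for both cycles."""
    g_source, g_target, d_source, d_target, adv = models
    batch_size = len(real_source)
    # real patches are labeled 1.0, fake patches 0.0 ('mse' LSGAN-style loss)
    valid = np.ones((batch_size,) + patch_shape)
    fake = np.zeros((batch_size,) + patch_shape)

    # 1) train the target discriminator on real vs. generated target images
    fake_target = g_target.predict(real_source)
    d_target.train_on_batch(real_target, valid)
    d_target.train_on_batch(fake_target, fake)

    # 2) train the source discriminator on real vs. generated source images
    fake_source = g_source.predict(real_target)
    d_source.train_on_batch(real_source, valid)
    d_source.train_on_batch(fake_source, fake)

    # 3) train forward and backward cycles through the one adversarial
    #    model: fool both discriminators and reconstruct both inputs
    x = [real_source, real_target]
    y = [valid, valid, real_source, real_target]
    return adv.train_on_batch(x, y)
```

Note that the generator weights are updated only in step 3, through the combined adversarial model, while steps 1 and 2 update each discriminator separately.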

Listing 7.1.5, cyclegan-7.1.1.py, shows the CycleGAN training routine in Keras:

def train_cyclegan(models, data, params, test_params, test_generator):
    """ Trains the CycleGAN.

    1) Train the target discriminator
    2) Train the source discriminator
    3) Train the forward and backward cycles of adversarial networks

    Arguments:

