Advanced Deep Learning with Keras

Disentangled Representation GANs

Depending on which labels and codes are supplied, the generator concatenates them with the z noise vector before the Dense projection and the upsampling stack of transposed convolutions:

if labels is not None:
    if codes is None:
        # ACGAN labels
        # concatenate z noise vector and one-hot labels
        inputs = [inputs, labels]
    else:
        # infoGAN codes
        # concatenate z noise vector, one-hot labels,
        # and codes 1 & 2
        inputs = [inputs, labels] + codes
    x = concatenate(inputs, axis=1)
elif codes is not None:
    # generator 0 of StackedGAN
    inputs = [inputs, codes]
    x = concatenate(inputs, axis=1)
else:
    # default input is just 100-dim noise (z-code)
    x = inputs

x = Dense(image_resize * image_resize * layer_filters[0])(x)
x = Reshape((image_resize, image_resize, layer_filters[0]))(x)

for filters in layer_filters:
    # first two convolution layers use strides = 2
    # the last two use strides = 1
    if filters > layer_filters[-2]:
        strides = 2
    else:
        strides = 1
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = Conv2DTranspose(filters=filters,
                        kernel_size=kernel_size,
                        strides=strides,
                        padding='same')(x)

if activation is not None:
    x = Activation(activation)(x)

# generator output is the synthesized image x
return Model(inputs, x, name='generator')
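The listing above is the tail of a generator builder. A minimal usage sketch follows, under stated assumptions: the builder's signature is assumed to be generator(inputs, image_size, activation='sigmoid', labels=None, codes=None), with image_resize, kernel_size, and layer_filters defined in its header (not shown in this excerpt); the latent and code sizes are illustrative; imports use tensorflow.keras.

# layers used by the builder above and by this sketch (assumed tensorflow.keras)
from tensorflow.keras.layers import (Input, Dense, Reshape, BatchNormalization,
                                     Activation, Conv2DTranspose, concatenate)
from tensorflow.keras.models import Model

latent_size = 62     # illustrative size of the z noise vector
num_labels = 10      # one-hot MNIST labels
image_size = 28      # MNIST image side length

z = Input(shape=(latent_size,), name='z_input')
labels = Input(shape=(num_labels,), name='labels')
code1 = Input(shape=(1,), name='code1')   # 1st continuous code
code2 = Input(shape=(1,), name='code2')   # 2nd continuous code

# passing both labels and codes selects the infoGAN branch of the builder
gen = generator(z, image_size, labels=labels, codes=[code1, code2])
gen.summary()

With both labels and codes supplied, the z vector, the one-hot label, and the two 1-dim codes are concatenated into a single conditioning vector before the Dense projection, so the synthesized image depends on all of them.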

The listing of the discriminator and Q-Network keeps the original default GAN output. Its three auxiliary outputs are highlighted: the softmax prediction of the discrete code (the one-hot label) and the probabilities of the two continuous codes, all conditioned on the input MNIST digit image.
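A minimal sketch of such a discriminator and Q-Network head structure is shown below, assuming the Keras functional API via tensorflow.keras; the convolutional feature extractor, layer sizes, and output names are illustrative assumptions rather than the book's exact listing. The first output is the usual real-versus-fake probability, while the Q-Network heads share a dense layer and predict the one-hot label and the two continuous codes.

from tensorflow.keras.layers import (Input, Conv2D, LeakyReLU, Flatten,
                                     Dense, Activation)
from tensorflow.keras.models import Model

inputs = Input(shape=(28, 28, 1), name='discriminator_input')
x = inputs
for filters in [32, 64, 128, 256]:
    # illustrative feature extractor: strided convolutions downsample the image
    x = LeakyReLU()(x)
    x = Conv2D(filters=filters, kernel_size=5, strides=2, padding='same')(x)
x = Flatten()(x)

# 1st output: probability that the input image is real (default GAN output)
real_fake = Dense(1)(x)
real_fake = Activation('sigmoid', name='real_fake')(real_fake)

# shared dense layer feeding the Q-Network heads
q = Dense(128)(x)

# 2nd output: softmax prediction of the discrete code (one-hot label)
label = Dense(10)(q)
label = Activation('softmax', name='label')(label)

# 3rd and 4th outputs: estimates of the continuous codes given the image
code1 = Activation('sigmoid', name='code1')(Dense(1)(q))
code2 = Activation('sigmoid', name='code2')(Dense(1)(q))

discriminator = Model(inputs,
                      [real_fake, label, code1, code2],
                      name='discriminator')
discriminator.summary()

Each head typically gets its own loss during training, for example binary cross-entropy for the real/fake output, categorical cross-entropy for the label, and a regression-style loss such as mean squared error for the continuous codes.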
