
Advanced Deep Learning with Keras


After two Conv2DTranspose layers with strides = 2, the feature maps will have a size of 28 × 28 × number of filters. Each Conv2DTranspose is preceded by batch normalization and ReLU. The final layer has a sigmoid activation that generates the 28 × 28 × 1 fake MNIST images. Each pixel is normalized to [0.0, 1.0], corresponding to the [0, 255] grayscale levels. The following listing shows the implementation of the generator network in Keras. A function is defined to build the generator model. Due to the length of the entire code, we limit the listing to the particular lines being discussed.
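Since the generator ends in a sigmoid, the real MNIST images must be rescaled into the same [0.0, 1.0] range before training. A minimal sketch of that preprocessing step (the variable names are ours, not from the listing):

```python
import numpy as np

# Hypothetical batch of grayscale images, uint8 pixels in [0, 255]
images = np.random.randint(0, 256, size=(16, 28, 28, 1), dtype=np.uint8)

# Rescale to [0.0, 1.0] so real samples match the generator's sigmoid output range
images = images.astype('float32') / 255.0
```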

Chapter 4

The complete code is available on GitHub: https://github.com/PacktPublishing/Advanced-Deep-Learning-with-Keras.

Listing 4.2.1, dcgan-mnist-4.2.1.py, shows us the generator network builder function for DCGAN:

from keras.layers import Activation, BatchNormalization
from keras.layers import Conv2DTranspose, Dense, Reshape
from keras.models import Model


def build_generator(inputs, image_size):
    """Build a Generator Model

    Stack of BN-ReLU-Conv2DTranspose to generate fake images.
    Output activation is sigmoid instead of tanh in [1].
    Sigmoid converges easily.

    # Arguments
        inputs (Layer): Input layer of the generator (the z-vector)
        image_size: Target size of one side (assuming square image)

    # Returns
        Model: Generator Model
    """
    image_resize = image_size // 4
    # network parameters
    kernel_size = 5
    layer_filters = [128, 64, 32, 1]

    x = Dense(image_resize * image_resize * layer_filters[0])(inputs)
    x = Reshape((image_resize, image_resize, layer_filters[0]))(x)

    for filters in layer_filters:
        # first two convolution layers use strides = 2
        # the last two use strides = 1
        if filters > layer_filters[-2]:
            strides = 2
        else:
            strides = 1
        x = BatchNormalization()(x)
        x = Activation('relu')(x)
        x = Conv2DTranspose(filters=filters,
                            kernel_size=kernel_size,
                            strides=strides,
                            padding='same')(x)

    x = Activation('sigmoid')(x)
    generator = Model(inputs, x, name='generator')
    return generator
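The upsampling arithmetic in this stack is easy to verify: with padding='same', a Conv2DTranspose layer multiplies each spatial dimension by its stride, so the reshaped 7 × 7 feature maps become 14 × 14 and then 28 × 28 after the two strides = 2 layers, while the two strides = 1 layers leave the size unchanged. A quick sketch of that calculation (the helper function is ours, for illustration only):

```python
def transposed_conv_output_size(size, stride):
    # For padding='same', Keras computes output = input * stride,
    # independent of kernel_size.
    return size * stride

image_size = 28
image_resize = image_size // 4      # 7, as in build_generator
strides_per_layer = [2, 2, 1, 1]    # matches the four-layer stack

size = image_resize
for s in strides_per_layer:
    size = transposed_conv_output_size(size, s)

print(size)  # 28: the two strides=2 layers double 7 -> 14 -> 28
```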

