Generative Adversarial Networks

import numpy as np
import matplotlib.pyplot as plt

# Create a wall of generated MNIST images
# (randomDim and the trained generator are defined earlier in the notebook)
def saveGeneratedImages(epoch, examples=100, dim=(10, 10),
                        figsize=(10, 10)):
    noise = np.random.normal(0, 1, size=[examples, randomDim])
    generatedImages = generator.predict(noise)
    generatedImages = generatedImages.reshape(examples, 28, 28)
    plt.figure(figsize=figsize)
    for i in range(generatedImages.shape[0]):
        plt.subplot(dim[0], dim[1], i+1)
        plt.imshow(generatedImages[i], interpolation='nearest',
                   cmap='gray_r')
        plt.axis('off')
    plt.tight_layout()
    plt.savefig('images/gan_generated_image_epoch_%d.png' % epoch)

The complete code for this can be found in the notebook VanillaGAN.ipynb at the GitHub repo for this chapter. In the coming sections we will cover some recent GAN architectures and implement them in TensorFlow.

Deep convolutional GAN (DCGAN)

Proposed in 2016, DCGANs have become one of the most popular and successful GAN architectures. The main idea of the design is to use only convolutional layers, without pooling layers or fully connected classifier layers at the end. Strided convolutions and transposed convolutions are employed for downsampling and upsampling the images, respectively.
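To make these two operations concrete, here is a small standalone snippet (not from the chapter's notebook; it assumes TensorFlow 2 and tf.keras) showing that a convolution with a stride of 2 halves the spatial dimensions, while a transposed convolution with the same stride doubles them:

import tensorflow as tf

# A strided convolution downsamples: 28 x 28 -> 14 x 14
x = tf.random.normal((1, 28, 28, 1))
down = tf.keras.layers.Conv2D(64, kernel_size=5, strides=2, padding='same')(x)
print(down.shape)  # (1, 14, 14, 64)

# A transposed convolution upsamples: 14 x 14 -> 28 x 28
up = tf.keras.layers.Conv2DTranspose(64, kernel_size=5, strides=2, padding='same')(down)
print(up.shape)    # (1, 28, 28, 64)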

Before going into the details of the DCGAN architecture and its capabilities, let us point out the major changes that were introduced in the paper:

• The network consists of convolutional layers only. The pooling layers are replaced by strided convolutions in the discriminator and by transposed convolutions in the generator.

• The fully connected classifying layers after the convolutions are removed.

• To help with the gradient flow, batch normalization is done after every convolutional layer. (All three changes are visible in the discriminator sketch after this list.)
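The following is a minimal illustrative discriminator reflecting these three changes, written with tf.keras; the filter counts and kernel sizes are assumptions for the example, not the exact configuration used later in the chapter:

import tensorflow as tf
from tensorflow.keras import layers

# Illustrative DCGAN-style discriminator: strided convolutions instead of
# pooling, batch normalization after the convolutional layers, and no fully
# connected hidden layers -- only a single sigmoid output unit.
discriminator = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(64, 5, strides=2, padding='same'),   # 28x28 -> 14x14
    layers.BatchNormalization(),
    layers.LeakyReLU(0.2),
    layers.Conv2D(128, 5, strides=2, padding='same'),  # 14x14 -> 7x7
    layers.BatchNormalization(),
    layers.LeakyReLU(0.2),
    layers.Flatten(),
    layers.Dense(1, activation='sigmoid'),
])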

The basic idea of DCGANs is the same as that of the vanilla GAN: we have a generator that takes in a 100-dimensional noise vector; the noise is projected and reshaped, and then passed through convolutional layers. The following diagram shows the generator architecture:

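As a code-level complement to the diagram, the following is a minimal tf.keras sketch of a generator built along these lines; the layer widths and kernel sizes are illustrative assumptions rather than the chapter's exact implementation:

import tensorflow as tf
from tensorflow.keras import layers

# Illustrative DCGAN-style generator: a 100-dimensional noise vector is
# projected with a dense layer, reshaped into a 7 x 7 feature map, and then
# upsampled to 28 x 28 with transposed convolutions, ending in a tanh output.
generator = tf.keras.Sequential([
    tf.keras.Input(shape=(100,)),
    layers.Dense(7 * 7 * 128),                                  # project
    layers.Reshape((7, 7, 128)),                                # reshape
    layers.BatchNormalization(),
    layers.ReLU(),
    layers.Conv2DTranspose(64, 5, strides=2, padding='same'),   # 7x7 -> 14x14
    layers.BatchNormalization(),
    layers.ReLU(),
    layers.Conv2DTranspose(1, 5, strides=2, padding='same',
                           activation='tanh'),                  # 14x14 -> 28x28
])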
