
Autoencoders

An impressive reconstruction of images from noisy images, I'm sure you'll agree.

You can access the code in the notebook DenoisingAutoencoder.ipynb if you want to play around with it.

Stacked autoencoder

Until now we have restricted ourselves to autoencoders with only one hidden layer. We can build deep autoencoders by stacking many layers in both the encoder and the decoder; such an autoencoder is called a stacked autoencoder. The features extracted by one encoder layer are passed on to the next encoder layer as input. The stacked autoencoder can be trained as a whole network with the aim of minimizing the reconstruction error, or each individual encoder/decoder network can first be pretrained using the unsupervised method you learned earlier, and then the complete network can be fine-tuned. When the deep autoencoder network is a convolutional network, we call it a convolutional autoencoder. Let us implement a convolutional autoencoder in TensorFlow 2.0 next.
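Before moving on, the stacked idea above can be sketched with fully connected layers. This is a minimal sketch trained end to end as a single network; the layer sizes (784-128-32-128-784) are illustrative assumptions, not taken from the text:

```python
import numpy as np
import tensorflow as tf
import tensorflow.keras as K

# A stacked (deep) autoencoder: two Dense encoder layers followed by
# two mirrored Dense decoder layers, trained as one network with the
# aim of minimizing the reconstruction error.
model = K.Sequential([
    K.layers.Dense(128, activation='relu', input_shape=(784,)),  # encoder layer 1
    K.layers.Dense(32, activation='relu'),                       # encoder layer 2 (bottleneck)
    K.layers.Dense(128, activation='relu'),                      # decoder layer 1
    K.layers.Dense(784, activation='sigmoid'),                   # decoder layer 2 (reconstruction)
])
model.compile(optimizer='adam', loss='binary_crossentropy')

# Random data stands in for flattened images (e.g. 28x28 MNIST digits);
# note the target is the input itself.
x = np.random.rand(16, 784).astype('float32')
model.fit(x, x, epochs=1, verbose=0)
```

Alternatively, each encoder/decoder pair could be pretrained separately and the stack fine-tuned afterwards, as described above.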

Convolutional autoencoder for removing noise from images

In the previous section we reconstructed handwritten digits from noisy input images. We used a fully connected network as the encoder and decoder for the task. However, we know that for images a convolutional network can give better results, so in this section we will use a convolutional network for both the encoder and the decoder. To get better results we will use multiple convolution layers in both the encoder and decoder networks; that is, we will make stacks of convolutional layers (along with maxpooling or upsampling layers). We will also be training the entire autoencoder as a single entity.

1. We import all the required modules; for convenience, we also import specific layers from tensorflow.keras.layers:

import numpy as np

import tensorflow as tf

import tensorflow.keras as K

from tensorflow.keras.layers import Dense, Conv2D, MaxPooling2D, UpSampling2D
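With these modules in place, the architecture described above can be sketched as a stack of convolution and maxpooling layers mirrored by convolution and upsampling layers. The filter counts and kernel sizes below are illustrative assumptions, not the book's exact model:

```python
import numpy as np
import tensorflow as tf
import tensorflow.keras as K
from tensorflow.keras.layers import Conv2D, MaxPooling2D, UpSampling2D

# Encoder: two Conv2D + MaxPooling2D stages shrink a 28x28x1 image to 7x7x16.
# Decoder: mirrored Conv2D + UpSampling2D stages restore the 28x28x1 shape.
# 'same' padding keeps spatial sizes aligned on the way down and back up.
model = K.Sequential([
    Conv2D(32, (3, 3), activation='relu', padding='same',
           input_shape=(28, 28, 1)),
    MaxPooling2D((2, 2), padding='same'),          # 14x14x32
    Conv2D(16, (3, 3), activation='relu', padding='same'),
    MaxPooling2D((2, 2), padding='same'),          # 7x7x16 (encoded)
    Conv2D(16, (3, 3), activation='relu', padding='same'),
    UpSampling2D((2, 2)),                          # 14x14x16
    Conv2D(32, (3, 3), activation='relu', padding='same'),
    UpSampling2D((2, 2)),                          # 28x28x32
    Conv2D(1, (3, 3), activation='sigmoid', padding='same'),  # 28x28x1
])
model.compile(optimizer='adam', loss='binary_crossentropy')
```

Trained on (noisy input, clean target) pairs, this single network learns to denoise, just as the fully connected version did in the previous section.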
