Autoencoders

        self.encoder = Encoder(hidden_dim=hidden_dim)
        self.decoder = Decoder(hidden_dim=hidden_dim, original_dim=original_dim)

    def call(self, input_features):
        encoded = self.encoder(input_features)
        reconstructed = self.decoder(encoded)
        return reconstructed
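For context, the snippet above is the tail of an Autoencoder model built by composing Encoder and Decoder layers. A minimal, self-contained sketch of such a model is shown below; the single Dense layer in each of the Encoder and Decoder classes is an assumption made here for illustration, not necessarily the book's exact definition:

```python
import tensorflow as tf
import tensorflow.keras as K


class Encoder(K.layers.Layer):
    def __init__(self, hidden_dim):
        super().__init__()
        # Compress the input into a hidden_dim-dimensional code
        self.hidden_layer = K.layers.Dense(units=hidden_dim, activation=tf.nn.relu)

    def call(self, input_features):
        return self.hidden_layer(input_features)


class Decoder(K.layers.Layer):
    def __init__(self, hidden_dim, original_dim):
        super().__init__()
        # Expand the code back to the original input dimensionality
        self.output_layer = K.layers.Dense(units=original_dim, activation=tf.nn.relu)

    def call(self, encoded):
        return self.output_layer(encoded)


class Autoencoder(K.Model):
    def __init__(self, hidden_dim, original_dim):
        super().__init__()
        self.encoder = Encoder(hidden_dim=hidden_dim)
        self.decoder = Decoder(hidden_dim=hidden_dim, original_dim=original_dim)

    def call(self, input_features):
        encoded = self.encoder(input_features)
        reconstructed = self.decoder(encoded)
        return reconstructed
```

Calling `Autoencoder(hidden_dim=128, original_dim=784)` on a batch of flattened 784-dimensional inputs returns a tensor of the same shape, i.e. the reconstruction.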

In the next section we will use the autoencoder that we defined here to reconstruct handwritten digits.

Reconstructing handwritten digits using an autoencoder

Now that we have our autoencoder model, with its encoder and decoder layers ready, let us try to reconstruct handwritten digits. The complete code is available in the chapter's GitHub repo, in the notebook VanillaAutoencoder.ipynb. The code requires the NumPy, TensorFlow, and Matplotlib modules:

import numpy as np
import tensorflow as tf
import tensorflow.keras as K
import matplotlib.pyplot as plt

Before starting with the actual implementation, let's also define some hyperparameters. If you play around with them, you will notice that even though the architecture of your model remains the same, there can be a significant change in model performance. Hyperparameter tuning (refer to Chapter 1, Neural Network Foundations with TensorFlow 2.0, for more details) is one of the important steps in deep learning. For reproducibility, we set the seeds for random number generation:

np.random.seed(11)
tf.random.set_seed(11)

batch_size = 256
max_epochs = 50
learning_rate = 1e-3
momentum = 8e-1
hidden_dim = 128
original_dim = 784

For training data, we use the MNIST dataset available in the TensorFlow datasets. We normalize the data so that pixel values lie in the range [0, 1]; this is achieved by simply dividing each pixel value by 255.
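The loading and normalization step can be sketched as follows; the `preprocess` helper is a name introduced here for illustration, not from the book's notebook:

```python
import numpy as np
import tensorflow.keras as K


def preprocess(images):
    # Flatten each 28x28 image to a 784-dimensional vector
    # and scale pixel values from [0, 255] to [0, 1]
    images = images.reshape((images.shape[0], -1)).astype(np.float32)
    return images / 255.0


# MNIST ships with tf.keras; labels are discarded because
# an autoencoder is trained to reconstruct its own input
(x_train, _), (x_test, _) = K.datasets.mnist.load_data()
x_train = preprocess(x_train)
x_test = preprocess(x_test)
```

After this step, `x_train` has shape (60000, 784) and `x_test` has shape (10000, 784), matching the `original_dim` of 784 defined above.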

