
(60000, 28, 28) (60000,)
(10000, 28, 28) (10000,)

We see some sample images:

number = 10  # how many digits we will display
plt.figure(figsize=(20, 4))
for index in range(number):
    # display original
    ax = plt.subplot(2, number, index + 1)
    plt.imshow(x_train[index], cmap='gray')
    ax.get_xaxis().set_visible(False)
    ax.get_yaxis().set_visible(False)
plt.show()

Figure 10: Sample images from the Fashion-MNIST dataset

Before we start, let us declare some hyperparameters, like the learning rate, the dimensions of the hidden layer and the latent space, the batch size, the number of epochs, and so on:

image_size = x_train.shape[1] * x_train.shape[2]
hidden_dim = 512
latent_dim = 10
num_epochs = 80
batch_size = 100
learning_rate = 0.001

We use the TensorFlow Keras Model API to build a VAE model. The __init__() function defines all the layers that we will be using; fc1 is the encoder's hidden layer, fc2 and fc3 produce the mean and log-variance of the latent distribution, and fc4 and fc5 form the decoder:

class VAE(tf.keras.Model):
    def __init__(self, dim, **kwargs):
        h_dim = dim[0]
        z_dim = dim[1]
        super(VAE, self).__init__(**kwargs)
        self.fc1 = tf.keras.layers.Dense(h_dim)       # encoder hidden layer
        self.fc2 = tf.keras.layers.Dense(z_dim)       # mean of the latent distribution
        self.fc3 = tf.keras.layers.Dense(z_dim)       # log-variance of the latent distribution
        self.fc4 = tf.keras.layers.Dense(h_dim)       # decoder hidden layer
        self.fc5 = tf.keras.layers.Dense(image_size)  # reconstruction layer

We define the functions that give us the encoder output, the decoder output, and the reparameterization. The implementations of the encoder and decoder functions are straightforward; however, we need to delve a little deeper into the reparameterize function. As you know, VAEs sample from a random node z, which is approximated by q(z|Θ), the approximation of the true posterior. Now, to learn the parameters we need to use backpropagation; however, backpropagation cannot work on random nodes. Using reparameterization, we introduce a new parameter eps that allows us to rewrite z in a way that lets backpropagation flow through the (now deterministic) node (https://arxiv.org/pdf/1312.6114v10.pdf):

    def encode(self, x):
        h = tf.nn.relu(self.fc1(x))
        return self.fc2(h), self.fc3(h)

    def reparameterize(self, mu, log_var):
        std = tf.exp(log_var * 0.5)
        eps = tf.random.normal(std.shape)
        return mu + eps * std

    def decode_logits(self, z):
        h = tf.nn.relu(self.fc4(z))
        return self.fc5(h)

    def decode(self, z):
        return tf.nn.sigmoid(self.decode_logits(z))

Lastly, we define the call() function, which controls how signals move through the different layers of the VAE:

    def call(self, inputs, training=None, mask=None):
        mu, log_var = self.encode(inputs)
        z = self.reparameterize(mu, log_var)
        x_reconstructed_logits = self.decode_logits(z)
        return x_reconstructed_logits, mu, log_var

Now we create the VAE model and declare the optimizer for it. You can see the summary of the model:

model = VAE([hidden_dim, latent_dim])
model.build(input_shape=(4, image_size))
model.summary()
optimizer = tf.keras.optimizers.Adam(learning_rate)
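Since the encoder is built from Dense layers, the 28×28 images must be flattened into vectors of length image_size before being fed to the model. The following is a minimal sketch of an input pipeline and training step, assuming the standard VAE objective (a sigmoid cross-entropy reconstruction term on the logits plus the closed-form Gaussian KL divergence); the names dataset, recon_loss, and kl_div are illustrative, and the exact training loop may differ:

dataset = tf.data.Dataset.from_tensor_slices(
    x_train.reshape(-1, image_size).astype('float32') / 255.)
dataset = dataset.shuffle(batch_size * 5).batch(batch_size)

for epoch in range(num_epochs):
    for x in dataset:
        with tf.GradientTape() as tape:
            x_logits, mu, log_var = model(x)
            # Reconstruction term: per-pixel sigmoid cross-entropy on the logits
            recon_loss = tf.reduce_sum(
                tf.nn.sigmoid_cross_entropy_with_logits(
                    labels=x, logits=x_logits), axis=-1)
            # KL(q(z|x) || N(0, I)) in closed form for a diagonal Gaussian
            kl_div = -0.5 * tf.reduce_sum(
                1. + log_var - tf.square(mu) - tf.exp(log_var), axis=-1)
            loss = tf.reduce_mean(recon_loss + kl_div)
        gradients = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(gradients, model.trainable_variables))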

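Once trained, the decoder can generate new images on its own: we sample latent vectors from the standard normal prior (the same distribution eps was drawn from) and pass them through decode. A brief illustrative sketch, reusing the plotting pattern from above:

z = tf.random.normal((number, latent_dim))        # sample from the prior
generated = model.decode(z)                       # pixel probabilities in [0, 1]
generated = tf.reshape(generated, (-1, 28, 28))   # back to image shape

plt.figure(figsize=(20, 4))
for index in range(number):
    ax = plt.subplot(2, number, index + 1)
    plt.imshow(generated[index], cmap='gray')
    ax.get_xaxis().set_visible(False)
    ax.get_yaxis().set_visible(False)
plt.show()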
