Chapter 9

        self.conv4 = Conv2D(1, 3, 1, activation='sigmoid',
                            padding='same')
        self.upsample = UpSampling2D((2, 2))

    def call(self, encoded):
        x = self.conv1(encoded)
        #print("dx1", x.shape)
        x = self.upsample(x)
        #print("dx2", x.shape)
        x = self.conv2(x)
        x = self.upsample(x)
        x = self.conv3(x)
        x = self.upsample(x)
        return self.conv4(x)

6. We combine the encoder and decoder to make an autoencoder model. This remains exactly the same as before:

class Autoencoder(K.Model):
    def __init__(self, filters):
        super(Autoencoder, self).__init__()
        self.encoder = Encoder(filters)
        self.decoder = Decoder(filters)

    def call(self, input_features):
        #print(input_features.shape)
        encoded = self.encoder(input_features)
        #print(encoded.shape)
        reconstructed = self.decoder(encoded)
        #print(reconstructed.shape)
        return reconstructed

7. Now we instantiate our model, then specify binary cross entropy as the loss function and Adam as the optimizer in the compile() method. Then, fit the model to the training dataset:

model = Autoencoder(filters)
model.compile(loss='binary_crossentropy', optimizer='adam')
loss = model.fit(x_train_noisy,
                 x_train,
                 validation_data=(x_test_noisy, x_test),
                 epochs=max_epochs,
                 batch_size=batch_size)
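Because every pixel is scaled to [0, 1], binary cross entropy scores the reconstruction pixel by pixel, treating each output as an independent Bernoulli probability. A minimal NumPy sketch of that per-pixel loss (a hypothetical stand-alone helper for illustration, not Keras's own implementation):

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    """Mean per-pixel binary cross entropy for targets and
    predictions in [0, 1]; eps guards against log(0)."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return float(-np.mean(y_true * np.log(y_pred)
                          + (1.0 - y_true) * np.log(1.0 - y_pred)))

y_true = np.array([1.0, 0.0, 1.0, 0.0])
y_pred = np.array([0.9, 0.1, 0.9, 0.1])
print(round(binary_crossentropy(y_true, y_pred), 4))  # 0.1054
```

Each term contributes -ln(0.9) ≈ 0.1054, so confident, correct predictions drive the loss toward zero, which is why the sigmoid output layer in the decoder pairs naturally with this loss.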

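Note that fit() pairs each noisy image with its clean original, so the network learns to undo the corruption. The arrays x_train_noisy and x_test_noisy were prepared earlier in the recipe; a common way to create them (an assumption here, since that step falls outside this excerpt) is to add scaled Gaussian noise and clip back to the valid [0, 1] range:

```python
import numpy as np

rng = np.random.default_rng(42)
x_train = rng.random((16, 28, 28, 1)).astype(np.float32)  # stand-in data

noise_factor = 0.5  # assumed value; the real one is set earlier in the recipe
x_train_noisy = x_train + noise_factor * rng.normal(size=x_train.shape)
x_train_noisy = np.clip(x_train_noisy, 0.0, 1.0).astype(np.float32)
```

The clipping step matters: without it, noisy pixels fall outside [0, 1] and the binary cross entropy loss no longer makes sense as a per-pixel Bernoulli likelihood.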
8. You can see the loss curve as the model is trained; in 50 epochs the loss was reduced to 0.0988:

plt.plot(range(max_epochs), loss.history['loss'])
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.show()

9. And finally, you can see the wonderful reconstructed images from the noisy input images:

number = 10  # how many digits we will display
plt.figure(figsize=(20, 4))
for index in range(number):
    # display original
    ax = plt.subplot(2, number, index + 1)
    plt.imshow(x_test_noisy[index].reshape(28, 28), cmap='gray')
    ax.get_xaxis().set_visible(False)
    ax.get_yaxis().set_visible(False)
    # display reconstruction
    ax = plt.subplot(2, number, index + 1 + number)
    plt.imshow(tf.reshape(model(x_test_noisy)[index], (28, 28)),
               cmap='gray')
    ax.get_xaxis().set_visible(False)
    ax.get_yaxis().set_visible(False)
plt.show()
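Besides eyeballing the reconstructions, you can put a number on denoising quality. One standard metric (not part of this recipe, added purely as an illustration) is the peak signal-to-noise ratio, sketched here in NumPy for images scaled to [0, 1]:

```python
import numpy as np

def psnr(clean, recon, max_val=1.0):
    """Peak signal-to-noise ratio in dB; higher means a closer
    reconstruction. Assumes pixel values in [0, max_val]."""
    mse = np.mean((clean - recon) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)

clean = np.zeros((28, 28))
recon = clean + 0.1          # constant error of 0.1 -> MSE = 0.01
print(round(psnr(clean, recon), 1))  # 20.0
```

Comparing psnr(x_test, model(x_test_noisy)) against psnr(x_test, x_test_noisy) would show how much noise the autoencoder actually removes.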

