The images need to be normalized before we train the network. For better performance, you can even add random jitter to the images; a sketch of a possible jitter step follows the normalize function below:

def normalize(input_image, label):
    # Cast to float and scale pixel values from [0, 255] to [-1, 1]
    input_image = tf.cast(input_image, tf.float32)
    input_image = (input_image / 127.5) - 1
    return input_image
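
The text mentions jitter but does not show it, so here is a minimal sketch of what such a step could look like, to be mapped onto the training sets before normalize. The helper name random_jitter, the resize target of 286, and the crop size of 256 are illustrative assumptions, not values given in the text:

def random_jitter(input_image, label):
    # Resize to a slightly larger resolution (286x286 assumed for illustration)
    input_image = tf.image.resize(
        input_image, [286, 286],
        method=tf.image.ResizeMethod.NEAREST_NEIGHBOR)
    # Randomly crop back to the assumed training resolution of 256x256
    input_image = tf.image.random_crop(input_image, size=[256, 256, 3])
    # Randomly flip the image horizontally
    input_image = tf.image.random_flip_left_right(input_image)
    return input_image, label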

The preceding function, when applied to the images, normalizes them to the range [-1, 1]. Let us apply it to our train and test datasets and create a data pipeline that will provide images for training in batches:

train_A = (train_A.map(normalize, num_parallel_calls=AUTOTUNE)
           .cache().shuffle(BUFFER_SIZE).batch(BATCH_SIZE))
train_B = (train_B.map(normalize, num_parallel_calls=AUTOTUNE)
           .cache().shuffle(BUFFER_SIZE).batch(BATCH_SIZE))
test_A = (test_A.map(normalize, num_parallel_calls=AUTOTUNE)
          .cache().shuffle(BUFFER_SIZE).batch(BATCH_SIZE))
test_B = (test_B.map(normalize, num_parallel_calls=AUTOTUNE)
          .cache().shuffle(BUFFER_SIZE).batch(BATCH_SIZE))

In the preceding code, the num_parallel_calls argument lets the map transformation take advantage of multiple CPU cores; you should set its value to the number of CPU cores in your system. If you are not sure, use AUTOTUNE = tf.data.experimental.AUTOTUNE so that TensorFlow dynamically determines the right number for you.
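
For completeness, the constants used in the pipeline above could be defined as follows; the shuffle buffer and batch sizes shown here are assumed values for illustration, not prescribed by the text:

AUTOTUNE = tf.data.experimental.AUTOTUNE  # let TensorFlow choose the degree of parallelism
BUFFER_SIZE = 1000                        # shuffle buffer size (assumed value)
BATCH_SIZE = 1                            # batch size (assumed value)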

Before moving ahead with the model definition, let us look at the images. Since each image was normalized to the range [-1, 1], it is rescaled back to [0, 1] before plotting so that its intensities can be displayed:

# Fetch one batch of images from each dataset
inpA = next(iter(train_A))
inpB = next(iter(train_B))
plt.subplot(121)
plt.title("Train Set A")
plt.imshow(inpA[0] * 0.5 + 0.5)
plt.subplot(122)
plt.title("Train Set B")
plt.imshow(inpB[0] * 0.5 + 0.5)
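
Assuming matplotlib.pyplot was imported as plt earlier in the chapter, a final plt.show() call renders the two-panel figure when running outside a notebook:

plt.show()  # not needed in a notebook with inline plotting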
