
Chapter 5

Let's compute the number of training, validation, and testing examples, and then compute the initial accuracy given by the pretrained MobileNetV2:

# Derive the split sizes from the weights used when loading the dataset
num_train, num_val, num_test = (
    metadata.splits['train'].num_examples * weight / 10
    for weight in SPLIT_WEIGHTS
)

initial_epochs = 10
steps_per_epoch = round(num_train) // BATCH_SIZE
validation_steps = 4

# Accuracy of the composed network before any fine-tuning
loss0, accuracy0 = model.evaluate(validation_batches, steps=validation_steps)

We get an initial accuracy of 50%.
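The split arithmetic above can be checked with plain numbers. As a minimal sketch, assume SPLIT_WEIGHTS = (8, 1, 1) (an 80/10/10 split) and a dataset of 23,262 examples; both values are assumptions for illustration, since neither is shown in this excerpt:

```python
# Hypothetical values: SPLIT_WEIGHTS and the example count are assumptions,
# not taken from the excerpt.
SPLIT_WEIGHTS = (8, 1, 1)   # 80% train, 10% validation, 10% test
num_examples = 23262        # e.g. the size of some tfds training split
BATCH_SIZE = 32

# Same generator-unpacking pattern as in the text
num_train, num_val, num_test = (
    num_examples * weight / 10 for weight in SPLIT_WEIGHTS
)
steps_per_epoch = round(num_train) // BATCH_SIZE

print(num_train, num_val, num_test, steps_per_epoch)
# → 18609.6 2326.2 2326.2 581
```

Note that the weighted splits are floats, which is why the text wraps num_train in round() before the integer division by BATCH_SIZE.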

We can now fine-tune the composed network by training for a few iterations and optimizing the non-frozen layers:

history = model.fit(train_batches,
                    epochs=initial_epochs,
                    validation_data=validation_batches)

Thanks to transfer learning, our network reaches a very high accuracy of 98% by using Google's MobileNetV2 trained on ImageNet. Transfer learning speeds up training by reusing existing pretrained image classification models and retraining only the top layer of the network to determine the class to which each image belongs:

Figure 19: Model accuracy using transfer learning
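The composed network described above can be sketched as follows. This is a minimal illustration, not the book's exact code: the input size, the single sigmoid output unit, and the optimizer are assumptions, and weights=None is used here only to avoid downloading the ImageNet weights that the text actually relies on (weights='imagenet'):

```python
import tensorflow as tf

IMG_SIZE = 160  # assumed input resolution; not stated in the excerpt

# Pretrained convolutional base. The text uses ImageNet weights
# (weights='imagenet'); weights=None here only avoids the download.
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(IMG_SIZE, IMG_SIZE, 3),
    include_top=False,   # drop the original ImageNet classification head
    weights=None,
)
base_model.trainable = False  # freeze the reused layers

# New top layer: pool the extracted features and classify two classes
model = tf.keras.Sequential([
    base_model,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
```

Because the base is frozen, only the small Dense head is updated during the first training phase; "optimizing the non-frozen layers" later means unfreezing some of the base's top blocks and continuing training at a low learning rate.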

In this section we learned how to use a pretrained model. The next section will explain where we can find a repository with many models.

Application Zoos with tf.keras and TensorFlow Hub

One of the nice things about transfer learning is that it is possible to reuse pretrained networks to save time and resources. There are many collections of ready-to-use networks out there, but the following two are the most used.
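As a taste of what such a zoo looks like, here is a minimal sketch using the tf.keras.applications module; the particular architectures listed are an illustrative assumption, and each of them can be instantiated with or without pretrained ImageNet weights:

```python
import tensorflow as tf

# A few of the ready-to-use architectures shipped with tf.keras;
# referencing the classes does not download any weights.
zoo = [
    tf.keras.applications.MobileNetV2,
    tf.keras.applications.ResNet50,
    tf.keras.applications.VGG16,
]
for build in zoo:
    print(build.__name__)
```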

