All the convolutional layers are pretrained, so we freeze them during the training
of the full model:

# i.e. freeze all convolutional InceptionV3 layers
for layer in base_model.layers:
    layer.trainable = False
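For reference, this snippet assumes a base_model and a full model assembled earlier in the chapter. A minimal sketch of one way to build them, assuming an InceptionV3 backbone without its original top plus a new pooling layer and Dense classification head (the layer sizes and the number of classes, 200, are illustrative assumptions):

# Illustrative sketch: InceptionV3 backbone plus a new classification head.
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.layers import GlobalAveragePooling2D, Dense
from tensorflow.keras.models import Model

base_model = InceptionV3(weights='imagenet', include_top=False)

x = base_model.output
x = GlobalAveragePooling2D()(x)                      # collapse spatial dims to a vector
x = Dense(1024, activation='relu')(x)                # new fully connected layer
predictions = Dense(200, activation='softmax')(x)    # new classification head

model = Model(inputs=base_model.input, outputs=predictions)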

The model is then compiled and trained for a few epochs so that the top layers are
trained. For the sake of simplicity, we omit the training code itself here:

# compile the model (should be done *after* setting layers to non-trainable)
model.compile(optimizer='rmsprop', loss='categorical_crossentropy')

# train the model on the new data for a few epochs
model.fit_generator(...)
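The actual fit_generator call is elided in the text. As a purely illustrative sketch, one common way to feed the model is an ImageDataGenerator reading class-labelled subfolders; the directory name, image size, batch size, and epoch count below are all assumptions, not the book's code:

# Illustrative only: one possible data pipeline for the elided training call.
from tensorflow.keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(rescale=1./255)
train_generator = train_datagen.flow_from_directory(
    'train_dir',                 # hypothetical directory of class subfolders
    target_size=(299, 299),      # InceptionV3's default input size
    batch_size=32,
    class_mode='categorical')

model.fit_generator(train_generator, epochs=3)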

Then we freeze the lower Inception layers and fine-tune the top ones. In this
example we decide to freeze the first 172 layers (this is a tunable
hyperparameter):

# we chose to train the top 2 inception blocks, i.e. we will freeze
# the first 172 layers and unfreeze the rest:
for layer in model.layers[:172]:
    layer.trainable = False
for layer in model.layers[172:]:
    layer.trainable = True
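If you prefer to pick the cut-off yourself rather than take 172 on faith, you can enumerate the layers and their names and look for the block boundary you want; a short sketch:

# Inspect layer indices and names to decide where to cut for fine-tuning.
for i, layer in enumerate(model.layers):
    print(i, layer.name)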

The model is then recompiled for fine-tuning optimization:

# we need to recompile the model for these modifications to take effect
# we use SGD with a low learning rate
from keras.optimizers import SGD
model.compile(optimizer=SGD(lr=0.0001, momentum=0.9),
              loss='categorical_crossentropy')

# we train our model again (this time fine-tuning the top 2 inception
# blocks alongside the top Dense layers)
model.fit_generator(...)
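Note that in recent TensorFlow releases the standalone keras import, the lr argument, and fit_generator are deprecated. A sketch of the equivalent fine-tuning step with the tf.keras API, assuming a generator like the train_generator sketched earlier (the epoch count is illustrative):

# Equivalent fine-tuning step with the tf.keras API (TF 2.x).
import tensorflow as tf

model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.0001, momentum=0.9),
              loss='categorical_crossentropy')
model.fit(train_generator, epochs=3)    # model.fit accepts generators directly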

Now we have a new deep network that re-uses a standard Inception-v3 network,
but it is trained on a new domain D via transfer learning. Of course, there are many
fine-tuning parameters to adjust for achieving good accuracy. However, we are now
re-using a very large pretrained network as a starting point via transfer learning.
In doing so, we avoid training it from scratch on our machines by reusing what is
already available in tf.keras.
