

• tf.keras.callbacks.LearningRateScheduler: This callback is used to dynamically change the learning rate during optimization.

• tf.keras.callbacks.EarlyStopping: This callback is used to interrupt training when validation performance has stopped improving for a while.

• tf.keras.callbacks.TensorBoard: This callback is used to monitor the model's behavior using TensorBoard.

For instance, we have already used TensorBoard as follows:

callbacks = [
  # Write TensorBoard logs to './logs' directory
  tf.keras.callbacks.TensorBoard(log_dir='./logs')
]
model.fit(data, labels, batch_size=256, epochs=100,
          callbacks=callbacks,
          validation_data=(val_data, val_labels))
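
The other callbacks follow the same pattern. The following is a minimal, self-contained sketch (the synthetic data, layer sizes, and learning rate schedule are illustrative assumptions, not code from the text) that combines LearningRateScheduler, EarlyStopping, and TensorBoard in a single call to fit():

import numpy as np
import tensorflow as tf

# Synthetic data, purely for illustration
data = np.random.random((1000, 32))
labels = np.random.randint(10, size=(1000,))
val_data = np.random.random((200, 32))
val_labels = np.random.randint(10, size=(200,))

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(32,)),
    tf.keras.layers.Dense(10, activation='softmax')
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

callbacks = [
    # Keep the initial learning rate for 10 epochs, then halve it each epoch
    tf.keras.callbacks.LearningRateScheduler(
        lambda epoch, lr: lr if epoch < 10 else lr * 0.5),
    # Stop when validation loss has not improved for 5 consecutive epochs
    tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=5),
    # Write TensorBoard logs to './logs' directory, as before
    tf.keras.callbacks.TensorBoard(log_dir='./logs')
]

model.fit(data, labels, batch_size=256, epochs=100,
          callbacks=callbacks,
          validation_data=(val_data, val_labels))

With EarlyStopping in place, training may stop well before the hundredth epoch.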

Saving a model and weights

After training a model, it can be useful to save the weights in a persistent way. This is easily achieved with the following code fragment, which saves to TensorFlow's internal format:

# Save weights to a TensorFlow Checkpoint file
model.save_weights('./weights/my_model')

If you want to save in Keras's format, which is portable across multiple backends, then use:

# Save weights to an HDF5 file
model.save_weights('my_model.h5', save_format='h5')

Weights are easily loaded with:

# Restore the model's state
model.load_weights(file_path)
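
Note that load_weights() restores only parameter values, so the target model must be built with the same architecture as the one whose weights were saved. A minimal sketch of the full round trip (the build_model() helper and its layer sizes are hypothetical) could look like this:

import os
import tensorflow as tf

def build_model():
    # The architecture must match the one whose weights were saved
    return tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation='relu', input_shape=(32,)),
        tf.keras.layers.Dense(10, activation='softmax')
    ])

model = build_model()
os.makedirs('./weights', exist_ok=True)
model.save_weights('./weights/my_model')      # TensorFlow Checkpoint format

restored = build_model()                      # same architecture, fresh weights
restored.load_weights('./weights/my_model')   # now mirrors the original model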

In addition to the weights, a model's architecture can be serialized to JSON with:

json_string = model.to_json() # save
model = tf.keras.models.model_from_json(json_string) # restore

If you prefer, the architecture can be serialized to YAML with:

yaml_string = model.to_yaml() # save
model = tf.keras.models.model_from_yaml(yaml_string) # restore
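
Keep in mind that to_json() and to_yaml() capture only the architecture, not the learned parameters. A common pattern, sketched below assuming model is a trained Keras model and using a hypothetical file name, is to pair the JSON string with a separate weights file:

# Persist the architecture and the weights separately
json_string = model.to_json()
model.save_weights('my_model_weights.h5', save_format='h5')

# Rebuild the architecture from JSON, then restore the trained weights
restored = tf.keras.models.model_from_json(json_string)
restored.load_weights('my_model_weights.h5')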

