Neural Network Foundations with TensorFlow 2.0

    model.add(layers.GlobalMaxPooling1D())
    model.add(layers.Dense(128, activation='relu'))
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(1, activation='sigmoid'))
    return model
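These lines are only the tail of the model-building function; the front of the network, an Embedding layer that turns word indices into dense vectors, is defined earlier in the chapter. Purely as a self-contained sketch, with n_words, dim_embedding, and max_len used as assumed placeholders for the vocabulary size, the embedding dimension, and the padded review length, the whole function might read:

from tensorflow.keras import layers, models

n_words = 10000        # vocabulary size (assumed value)
dim_embedding = 256    # embedding size per word (assumed value)
max_len = 200          # padded/truncated review length (assumed value)

def build_model():
    model = models.Sequential()
    # Map each word index to a dense vector of size dim_embedding
    model.add(layers.Embedding(n_words, dim_embedding, input_length=max_len))
    # Keep the maximum value of each feature across the whole sequence
    model.add(layers.GlobalMaxPooling1D())
    model.add(layers.Dense(128, activation='relu'))
    model.add(layers.Dropout(0.5))
    # One sigmoid unit: probability that the review is positive
    model.add(layers.Dense(1, activation='sigmoid'))
    return model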

Now we need to train our model, and this piece of code is very similar to what we did with MNIST. Let's see:

(X_train, y_train), (X_test, y_test) = load_data()
model = build_model()
model.summary()

model.compile(optimizer="adam", loss="binary_crossentropy",
    metrics=["accuracy"]
)

score = model.fit(X_train, y_train,
    epochs=EPOCHS,
    batch_size=BATCH_SIZE,
    validation_data=(X_test, y_test)
)

# evaluate() returns the loss followed by the compiled metrics
score = model.evaluate(X_test, y_test, batch_size=BATCH_SIZE)
print("\nTest score:", score[0])
print('Test accuracy:', score[1])
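This snippet relies on load_data(), EPOCHS, and BATCH_SIZE, which are defined earlier in the chapter and not shown here. As a rough sketch only, assuming this is the IMDb sentiment-analysis example (the one-unit sigmoid output and binary cross-entropy point to binary classification) and reusing the assumed n_words and max_len constants from above, the loader and the two training constants could look like this:

from tensorflow.keras import datasets, preprocessing

EPOCHS = 20        # assumed value
BATCH_SIZE = 500   # assumed value

def load_data():
    # Keep only the n_words most frequent words in the vocabulary
    (X_train, y_train), (X_test, y_test) = datasets.imdb.load_data(num_words=n_words)
    # Pad (or truncate) every review to exactly max_len word indices
    X_train = preprocessing.sequence.pad_sequences(X_train, maxlen=max_len)
    X_test = preprocessing.sequence.pad_sequences(X_test, maxlen=max_len)
    return (X_train, y_train), (X_test, y_test)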

Let's see the network and then run a few iterations:

Figure 36: The results of the network following a few iterations

