Advanced Deep Learning with Keras

Introducing Advanced Deep Learning with Keras

Listing 1.3.2 shows the model summary of the proposed network. It requires a total of 269,322 parameters. This is substantial considering the simple task of classifying MNIST digits; MLPs are not parameter efficient. The number of parameters can be computed from Figure 1.3.4 by focusing on how the output of the perceptron is computed. From the input to the first Dense layer: 784 × 256 + 256 = 200,960. From the first Dense to the second Dense layer: 256 × 256 + 256 = 65,792. From the second Dense layer to the output layer: 256 × 10 + 10 = 2,570. The total is 269,322.
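
Each Dense layer contributes (inputs × outputs) weights plus one bias per output, so the count can be checked with a few lines of Python (a quick sanity check, not part of the book's code):

# Parameters of a Dense layer = inputs * outputs (weights) + outputs (biases)
dense_1 = 784 * 256 + 256  # 200,960
dense_2 = 256 * 256 + 256  # 65,792
dense_3 = 256 * 10 + 10    # 2,570
print(dense_1 + dense_2 + dense_3)  # 269,322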

Listing 1.3.2 shows a summary of an MLP MNIST digit classifier model:

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_1 (Dense)              (None, 256)               200960
_________________________________________________________________
activation_1 (Activation)    (None, 256)               0
_________________________________________________________________
dropout_1 (Dropout)          (None, 256)               0
_________________________________________________________________
dense_2 (Dense)              (None, 256)               65792
_________________________________________________________________
activation_2 (Activation)    (None, 256)               0
_________________________________________________________________
dropout_2 (Dropout)          (None, 256)               0
_________________________________________________________________
dense_3 (Dense)              (None, 10)                2570
_________________________________________________________________
activation_3 (Activation)    (None, 10)                0
=================================================================
Total params: 269,322
Trainable params: 269,322
Non-trainable params: 0
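
For reference, the summary above can be reproduced with a Sequential model of two 256-unit hidden Dense layers followed by a 10-unit output layer. The sketch below is inferred only from the layer names and shapes in the summary; the ReLU/softmax activations and the dropout rate are assumptions that do not affect the parameter count:

from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout

# 784 inputs -> 256 -> 256 -> 10, matching the summary above
model = Sequential()
model.add(Dense(256, input_dim=784))  # dense_1: 784*256 + 256 = 200,960 params
model.add(Activation('relu'))         # activation choice is an assumption
model.add(Dropout(0.45))              # dropout rate is an assumption
model.add(Dense(256))                 # dense_2: 256*256 + 256 = 65,792 params
model.add(Activation('relu'))
model.add(Dropout(0.45))
model.add(Dense(10))                  # dense_3: 256*10 + 10 = 2,570 params
model.add(Activation('softmax'))      # softmax over the 10 digit classes
model.summary()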

Another way of verifying the network is by calling:

from keras.utils import plot_model
plot_model(model, to_file='mlp-mnist.png', show_shapes=True)

Figure 1.3.9 shows the plot. You'll find that this is similar to the results of summary(), but it graphically shows the interconnections and the I/O of each layer.
