Advanced Deep Learning with Keras

Deep Neural Networks

# orig paper uses SGD but RMSprop works better for DenseNet
model = Model(inputs=inputs, outputs=outputs)
model.compile(loss='categorical_crossentropy',
              optimizer=RMSprop(1e-3),
              metrics=['accuracy'])
model.summary()

Training the Keras implementation in Listing 2.4.1 for 200 epochs achieves 93.74%
accuracy, versus the 95.49% reported in the paper. Data augmentation is used. We
use the same callback functions for DenseNet as in ResNet v1/v2.
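The callback setup itself is not reproduced in this excerpt. A minimal sketch of the kind of callbacks used for this training, with a step-decay learning-rate schedule, might look like the following; the epoch breakpoints and decay factors are illustrative assumptions, not the book's exact values:

```python
import numpy as np
from tensorflow.keras.callbacks import (LearningRateScheduler,
                                        ReduceLROnPlateau)

def lr_schedule(epoch):
    """Step-decay schedule over 200 epochs.

    Breakpoints and factors below are illustrative assumptions.
    """
    lr = 1e-3
    if epoch > 180:
        lr *= 0.5e-3
    elif epoch > 160:
        lr *= 1e-3
    elif epoch > 120:
        lr *= 1e-2
    elif epoch > 80:
        lr *= 1e-1
    return lr

# decay the lr on schedule, and further reduce it when the
# validation loss plateaus
callbacks = [
    LearningRateScheduler(lr_schedule),
    ReduceLROnPlateau(factor=np.sqrt(0.1), cooldown=0,
                      patience=5, min_lr=0.5e-6),
]
```

The `callbacks` list would then be passed to `model.fit(..., callbacks=callbacks)`.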

For the deeper configurations, the growth_rate and depth variables must be
changed using the table in the Python code. However, it will take a substantial
amount of time to train the network at a depth of 250 or 190, as done in the paper.
To give us an idea of the training time, each epoch runs for about an hour on a
1060Ti GPU. Though there is also an implementation of DenseNet in the Keras
applications module, it was trained on ImageNet.
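For reference, the DenseNet variants in the Keras applications module can be instantiated directly; a minimal sketch (DenseNet121 chosen here as an example, with `weights=None` and a small input shape purely for illustration) is:

```python
from tensorflow.keras.applications import DenseNet121

# Instantiate DenseNet121 from the Keras applications module.
# Passing weights='imagenet' instead would download the
# ImageNet-pretrained weights; weights=None builds an
# untrained network, which lets us choose our own number
# of classes and input shape.
model = DenseNet121(weights=None, classes=10,
                    input_shape=(32, 32, 3))
```

Note that the pretrained weights target ImageNet's 1,000 classes and 224 x 224 inputs, so for CIFAR10-style tasks the network would still need to be retrained or fine-tuned.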

Conclusion

In this chapter, we've presented the Functional API as an advanced method for
building complex deep neural network models using Keras. We also demonstrated
how the Functional API can be used to build a multi-input, single-output
Y-Network. This network, when compared to a single-branch CNN, achieves better
accuracy.
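To recap the idea, a Y-Network can be sketched in a few lines with the Functional API: two convolutional branches process two inputs, and their features are merged into a single classifier head. The filter counts, kernel sizes, and input shape below are illustrative, not the book's exact configuration:

```python
from tensorflow.keras.layers import (Input, Conv2D, MaxPooling2D,
                                     Flatten, Dense, concatenate)
from tensorflow.keras.models import Model

# two inputs, one per branch of the "Y"
left_in = Input(shape=(28, 28, 1))
right_in = Input(shape=(28, 28, 1))

def branch(x):
    # one small conv block per branch (sizes are illustrative)
    x = Conv2D(32, 3, padding='same', activation='relu')(x)
    x = MaxPooling2D()(x)
    return x

# merge the two branches, then classify
y = concatenate([branch(left_in), branch(right_in)])
y = Flatten()(y)
outputs = Dense(10, activation='softmax')(y)

# multi-input, single-output model: only possible with the
# Functional API, not the Sequential API
model = Model(inputs=[left_in, right_in], outputs=outputs)
```

The key point is the last line: `Model` accepts a list of input tensors, which the Sequential API cannot express.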

For the rest of the book, we'll find the Functional API indispensable in building more

complex and advanced models. For example, in the next chapter, the Functional API

will enable us to build a modular encoder, decoder, and autoencoder.

We also spent significant time exploring two important deep networks, ResNet and
DenseNet. Both of these networks have been used not only in classification but
also in other areas, such as segmentation, detection, tracking, generation, and
visual/semantic understanding. We need to remember that it is more important
to understand the design decisions behind ResNet and DenseNet than to merely
follow the original implementation. In that manner, we'll be able to adapt the
key concepts of ResNet and DenseNet for our own purposes.
