Advanced Deep Learning with Keras

Deep Neural Networks

In this chapter, we'll be examining deep neural networks. These networks have shown excellent classification accuracy on more challenging and advanced datasets like ImageNet, CIFAR10 (https://www.cs.toronto.edu/~kriz/learning-features-2009-TR.pdf), and CIFAR100. For conciseness, we'll focus on only two networks, ResNet [2][4] and DenseNet [5]. While we will go into much more detail later, it's worth taking a minute to introduce these networks.

ResNet introduced the concept of residual learning, which enabled it to build very deep networks by addressing the vanishing gradient problem in deep convolutional networks.
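The idea behind residual learning can be sketched in a few lines of plain NumPy. Note that this is an illustrative toy, not the book's actual Keras code: the matrix multiplications below stand in for the convolutional layers of a real ResNet block, and all names are made up for this sketch.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    """Toy residual block. Instead of learning a direct mapping H(x),
    the block learns a residual F(x) and outputs F(x) + x via an
    identity shortcut."""
    f = relu(x @ w1)      # first transformation in the residual branch
    f = f @ w2            # second transformation: the branch output F(x)
    return relu(f + x)    # identity shortcut added back before activation

# If the residual branch outputs zero (here, w2 is all zeros), the block
# reduces to the identity mapping, so stacking many such blocks cannot
# degrade the signal -- and gradients flow through the shortcut unimpeded.
x = np.array([[1.0, 2.0, 3.0]])
w1 = np.random.randn(3, 3)
w2 = np.zeros((3, 3))
print(residual_block(x, w1, w2))  # -> [[1. 2. 3.]]
```

This is why residual blocks make very deep networks trainable: the network only needs to learn the residual F(x), and defaulting to the identity is trivially easy.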

DenseNet improved on the ResNet technique further by allowing every convolution to have direct access to the inputs and to lower-layer feature maps. It also manages to keep the number of parameters low in deep networks by utilizing both Bottleneck and Transition layers.
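The connectivity pattern, and the channel-count arithmetic it implies, can be sketched in NumPy. This is a toy illustration rather than the book's Keras implementation: random arrays stand in for real convolution outputs, and the function and parameter names are made up for this sketch.

```python
import numpy as np

def dense_block(x, num_layers, growth_rate, rng=None):
    """Toy dense block. Each layer sees the concatenation of the block
    input and every earlier layer's output along the channel axis,
    giving it direct access to all lower-layer feature maps."""
    rng = rng or np.random.default_rng(0)
    features = [x]
    for _ in range(num_layers):
        concat = np.concatenate(features, axis=-1)  # all prior maps, concatenated
        # each layer contributes only growth_rate new channels, which is
        # how DenseNet keeps the parameter count low
        new_maps = rng.standard_normal(concat.shape[:-1] + (growth_rate,))
        features.append(new_maps)
    return np.concatenate(features, axis=-1)

# Channel count grows linearly: in_channels + num_layers * growth_rate.
x = np.zeros((8, 8, 16))   # e.g. an 8x8 feature map with 16 channels
out = dense_block(x, num_layers=4, growth_rate=12)
print(out.shape)  # -> (8, 8, 64), i.e. 16 + 4 * 12 channels
```

In the real network, a Bottleneck (1 x 1 convolution) compresses the growing concatenation before each layer, and a Transition layer shrinks the feature maps between dense blocks; both are covered in detail later in the chapter.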

But why these two models, and not others? Well, since their introduction, countless models such as ResNeXt [6] and FractalNet [7] have been inspired by the techniques used by these two networks. Likewise, with an understanding of both ResNet and DenseNet, we'll be able to use their design guidelines to build our own models. Through transfer learning, we'll also be able to take advantage of pretrained ResNet and DenseNet models for our own purposes. These reasons alone, along with their compatibility with Keras, make the two models ideal for exploring and complementing the advanced deep learning scope of this book.

